I: pbuilder: network access will be disabled during build I: Current time: Wed Jan 21 13:36:58 +14 2026 I: pbuilder-time-stamp: 1768952218 I: Building the build Environment I: extracting base tarball [/var/cache/pbuilder/trixie-reproducible-base.tgz] I: copying local configuration W: --override-config is not set; not updating apt.conf Read the manpage for details. I: mounting /proc filesystem I: mounting /sys filesystem I: creating /{dev,run}/shm I: mounting /dev/pts filesystem I: redirecting /dev/ptmx to /dev/pts/ptmx I: policy-rc.d already exists I: Copying source file I: copying [sparql-wrapper-python_2.0.0-2.dsc] I: copying [./sparql-wrapper-python_2.0.0.orig.tar.gz] I: copying [./sparql-wrapper-python_2.0.0-2.debian.tar.xz] I: Extracting source gpgv: Signature made Wed Jun 26 07:57:36 2024 gpgv: using RSA key 8F6DE104377F3B11E741748731F3144544A1741A gpgv: issuer "tchet@debian.org" gpgv: Can't check signature: No public key dpkg-source: warning: cannot verify inline signature for ./sparql-wrapper-python_2.0.0-2.dsc: no acceptable signature found dpkg-source: info: extracting sparql-wrapper-python in sparql-wrapper-python-2.0.0 dpkg-source: info: unpacking sparql-wrapper-python_2.0.0.orig.tar.gz dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz I: Not using root during the build. I: Installing the build-deps I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/D01_modify_environment starting debug: Running on infom02-amd64. I: Changing host+domainname to test build reproducibility I: Adding a custom variable just for the fun of it... I: Changing /bin/sh to bash '/bin/sh' -> '/bin/bash' lrwxrwxrwx 1 root root 9 Jan 20 23:37 /bin/sh -> /bin/bash I: Setting pbuilder2's login shell to /bin/bash I: Setting pbuilder2's GECOS to second user,second room,second work-phone,second home-phone,second other I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/D01_modify_environment finished I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/D02_print_environment starting I: set BASH=/bin/sh BASHOPTS=checkwinsize:cmdhist:complete_fullquote:extquote:force_fignore:globasciiranges:globskipdots:hostcomplete:interactive_comments:patsub_replacement:progcomp:promptvars:sourcepath BASH_ALIASES=() BASH_ARGC=() BASH_ARGV=() BASH_CMDS=() BASH_LINENO=([0]="12" [1]="0") BASH_LOADABLES_PATH=/usr/local/lib/bash:/usr/lib/bash:/opt/local/lib/bash:/usr/pkg/lib/bash:/opt/pkg/lib/bash:. 
BASH_SOURCE=([0]="/tmp/hooks/D02_print_environment" [1]="/tmp/hooks/D02_print_environment") BASH_VERSINFO=([0]="5" [1]="2" [2]="37" [3]="1" [4]="release" [5]="x86_64-pc-linux-gnu") BASH_VERSION='5.2.37(1)-release' BUILDDIR=/build/reproducible-path BUILDUSERGECOS='second user,second room,second work-phone,second home-phone,second other' BUILDUSERNAME=pbuilder2 BUILD_ARCH=amd64 DEBIAN_FRONTEND=noninteractive DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 ' DIRSTACK=() DISTRIBUTION=trixie EUID=0 FUNCNAME=([0]="Echo" [1]="main") GROUPS=() HOME=/root HOSTNAME=i-capture-the-hostname HOSTTYPE=x86_64 HOST_ARCH=amd64 IFS=' ' INVOCATION_ID=260704dc8c08472dbf3c1dce3b447c19 LANG=C LANGUAGE=et_EE:et LC_ALL=C MACHTYPE=x86_64-pc-linux-gnu MAIL=/var/mail/root OPTERR=1 OPTIND=1 OSTYPE=linux-gnu PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path PBCURRENTCOMMANDLINEOPERATION=build PBUILDER_OPERATION=build PBUILDER_PKGDATADIR=/usr/share/pbuilder PBUILDER_PKGLIBDIR=/usr/lib/pbuilder PBUILDER_SYSCONFDIR=/etc PIPESTATUS=([0]="0") POSIXLY_CORRECT=y PPID=2595058 PS4='+ ' PWD=/ SHELL=/bin/bash SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix SHLVL=3 SUDO_COMMAND='/usr/bin/timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.N8BJLpcs/pbuilderrc_8YQR --distribution trixie --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/trixie-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.N8BJLpcs/b2 --logfile b2/build.log sparql-wrapper-python_2.0.0-2.dsc' SUDO_GID=109 SUDO_UID=104 SUDO_USER=jenkins TERM=unknown TZ=/usr/share/zoneinfo/Etc/GMT-14 UID=0 USER=root _='I: set' I: uname -a Linux i-capture-the-hostname 6.11.5+bpo-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.11.5-1~bpo12+1 (2024-11-11) x86_64 GNU/Linux I: ls -l /bin lrwxrwxrwx 1 root root 7 Nov 22 2024 /bin -> usr/bin I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/D02_print_environment finished -> Attempting to satisfy build-dependencies -> Creating pbuilder-satisfydepends-dummy package Package: pbuilder-satisfydepends-dummy Version: 0.invalid.0 Architecture: amd64 Maintainer: Debian Pbuilder Team Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder This package was created automatically by pbuilder to satisfy the build-dependencies of the package being currently built. Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'. Selecting previously unselected package pbuilder-satisfydepends-dummy. (Reading database ... 19959 files and directories currently installed.) Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ... Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ... dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested: pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however: Package debhelper-compat is not installed. pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however: Package dh-sequence-python3 is not installed. pbuilder-satisfydepends-dummy depends on python3-all; however: Package python3-all is not installed. 
pbuilder-satisfydepends-dummy depends on python3-pytest; however: Package python3-pytest is not installed. pbuilder-satisfydepends-dummy depends on python3-setuptools; however: Package python3-setuptools is not installed. pbuilder-satisfydepends-dummy depends on python3-rdflib; however: Package python3-rdflib is not installed. Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ... Reading package lists... Building dependency tree... Reading state information... Initializing package states... Writing extended state information... Building tag database... pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0) The following NEW packages will be installed: autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} debhelper{a} dh-autoreconf{a} dh-python{a} dh-strip-nondeterminism{a} dwz{a} file{a} gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a} libcom-err2{a} libdebhelper-perl{a} libelf1t64{a} libexpat1{a} libfile-stripnondeterminism-perl{a} libgssapi-krb5-2{a} libicu72{a} libk5crypto3{a} libkeyutils1{a} libkrb5-3{a} libkrb5support0{a} libmagic-mgc{a} libmagic1t64{a} libnsl2{a} libpipeline1{a} libpython3-stdlib{a} libpython3.12-minimal{a} libpython3.12-stdlib{a} libreadline8t64{a} libtirpc-common{a} libtirpc3t64{a} libtool{a} libuchardet0{a} libxml2{a} m4{a} man-db{a} media-types{a} netbase{a} po-debconf{a} python3{a} python3-all{a} python3-autocommand{a} python3-inflect{a} python3-iniconfig{a} python3-jaraco.context{a} python3-jaraco.functools{a} python3-jaraco.text{a} python3-minimal{a} python3-more-itertools{a} python3-packaging{a} python3-pkg-resources{a} python3-pluggy{a} python3-pyparsing{a} python3-pytest{a} python3-rdflib{a} python3-setuptools{a} python3-typeguard{a} python3-typing-extensions{a} python3-zipp{a} python3.12{a} python3.12-minimal{a} readline-common{a} sensible-utils{a} tzdata{a} The following packages are RECOMMENDED but will NOT be installed: ca-certificates curl krb5-locales libarchive-cpio-perl libltdl-dev libmail-sendmail-perl lynx python3-html5rdf python3-lxml python3-networkx python3-orjson python3-pygments wget 0 packages upgraded, 70 newly installed, 0 to remove and 0 not upgraded. Need to get 29.2 MB of archives. After unpacking 116 MB will be used. Writing extended state information... 
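The dummy-package stanza above is the whole trick behind pbuilder's dependency bootstrap: the source package's Build-Depends become the Depends of a throwaway binary package, so installing that one package makes aptitude resolve and pull in the real build-dependencies. A rough Python sketch of the same idea, using the python3-debian bindings (the import, file name, and printing are illustrative assumptions; pbuilder itself does this step in shell with dpkg-deb):

```python
# Conceptual sketch only -- pbuilder implements this in shell, not Python.
from debian import deb822  # python3-debian; an assumption, not used by pbuilder

# Read Build-Depends straight out of the source package's .dsc
with open("sparql-wrapper-python_2.0.0-2.dsc") as f:
    dsc = deb822.Dsc(f)

# Mirror the pbuilder-satisfydepends-dummy stanza shown in the log
dummy = deb822.Deb822()
dummy["Package"] = "pbuilder-satisfydepends-dummy"
dummy["Version"] = "0.invalid.0"
dummy["Architecture"] = "amd64"
dummy["Maintainer"] = "Debian Pbuilder Team"  # address stripped in the log above
dummy["Depends"] = dsc.get("Build-Depends", "")
dummy["Description"] = "Dummy package to satisfy dependencies with aptitude"

# dpkg-deb then wraps a control file like this into the .deb that gets installed.
print(dummy)
```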
Get: 1 http://deb.debian.org/debian trixie/main amd64 libpython3.12-minimal amd64 3.12.8-3 [817 kB] Get: 2 http://deb.debian.org/debian trixie/main amd64 libexpat1 amd64 2.6.4-1 [106 kB] Get: 3 http://deb.debian.org/debian trixie/main amd64 python3.12-minimal amd64 3.12.8-3 [2162 kB] Get: 4 http://deb.debian.org/debian trixie/main amd64 python3-minimal amd64 3.12.6-1 [26.7 kB] Get: 5 http://deb.debian.org/debian trixie/main amd64 media-types all 10.1.0 [26.9 kB] Get: 6 http://deb.debian.org/debian trixie/main amd64 netbase all 6.4 [12.8 kB] Get: 7 http://deb.debian.org/debian trixie/main amd64 tzdata all 2024b-4 [256 kB] Get: 8 http://deb.debian.org/debian trixie/main amd64 libkrb5support0 amd64 1.21.3-3 [32.5 kB] Get: 9 http://deb.debian.org/debian trixie/main amd64 libcom-err2 amd64 1.47.2~rc1-2 [23.8 kB] Get: 10 http://deb.debian.org/debian trixie/main amd64 libk5crypto3 amd64 1.21.3-3 [79.9 kB] Get: 11 http://deb.debian.org/debian trixie/main amd64 libkeyutils1 amd64 1.6.3-4 [9092 B] Get: 12 http://deb.debian.org/debian trixie/main amd64 libkrb5-3 amd64 1.21.3-3 [324 kB] Get: 13 http://deb.debian.org/debian trixie/main amd64 libgssapi-krb5-2 amd64 1.21.3-3 [136 kB] Get: 14 http://deb.debian.org/debian trixie/main amd64 libtirpc-common all 1.3.4+ds-1.3 [10.9 kB] Get: 15 http://deb.debian.org/debian trixie/main amd64 libtirpc3t64 amd64 1.3.4+ds-1.3+b1 [83.1 kB] Get: 16 http://deb.debian.org/debian trixie/main amd64 libnsl2 amd64 1.3.0-3+b3 [40.6 kB] Get: 17 http://deb.debian.org/debian trixie/main amd64 readline-common all 8.2-6 [69.4 kB] Get: 18 http://deb.debian.org/debian trixie/main amd64 libreadline8t64 amd64 8.2-6 [169 kB] Get: 19 http://deb.debian.org/debian trixie/main amd64 libpython3.12-stdlib amd64 3.12.8-3 [1969 kB] Get: 20 http://deb.debian.org/debian trixie/main amd64 python3.12 amd64 3.12.8-3 [677 kB] Get: 21 http://deb.debian.org/debian trixie/main amd64 libpython3-stdlib amd64 3.12.6-1 [9692 B] Get: 22 http://deb.debian.org/debian trixie/main amd64 python3 amd64 3.12.6-1 [27.8 kB] Get: 23 http://deb.debian.org/debian trixie/main amd64 sensible-utils all 0.0.24 [24.8 kB] Get: 24 http://deb.debian.org/debian trixie/main amd64 libmagic-mgc amd64 1:5.45-3+b1 [314 kB] Get: 25 http://deb.debian.org/debian trixie/main amd64 libmagic1t64 amd64 1:5.45-3+b1 [108 kB] Get: 26 http://deb.debian.org/debian trixie/main amd64 file amd64 1:5.45-3+b1 [43.3 kB] Get: 27 http://deb.debian.org/debian trixie/main amd64 gettext-base amd64 0.22.5-2 [200 kB] Get: 28 http://deb.debian.org/debian trixie/main amd64 libuchardet0 amd64 0.0.8-1+b2 [68.9 kB] Get: 29 http://deb.debian.org/debian trixie/main amd64 groff-base amd64 1.23.0-6 [1184 kB] Get: 30 http://deb.debian.org/debian trixie/main amd64 bsdextrautils amd64 2.40.2-12 [92.0 kB] Get: 31 http://deb.debian.org/debian trixie/main amd64 libpipeline1 amd64 1.5.8-1 [42.0 kB] Get: 32 http://deb.debian.org/debian trixie/main amd64 man-db amd64 2.13.0-1 [1420 kB] Get: 33 http://deb.debian.org/debian trixie/main amd64 m4 amd64 1.4.19-4 [287 kB] Get: 34 http://deb.debian.org/debian trixie/main amd64 autoconf all 2.72-3 [493 kB] Get: 35 http://deb.debian.org/debian trixie/main amd64 autotools-dev all 20220109.1 [51.6 kB] Get: 36 http://deb.debian.org/debian trixie/main amd64 automake all 1:1.16.5-1.3 [823 kB] Get: 37 http://deb.debian.org/debian trixie/main amd64 autopoint all 0.22.5-2 [723 kB] Get: 38 http://deb.debian.org/debian trixie/main amd64 libdebhelper-perl all 13.20 [89.7 kB] Get: 39 http://deb.debian.org/debian trixie/main amd64 libtool all 
2.4.7-8 [517 kB] Get: 40 http://deb.debian.org/debian trixie/main amd64 dh-autoreconf all 20 [17.1 kB] Get: 41 http://deb.debian.org/debian trixie/main amd64 libarchive-zip-perl all 1.68-1 [104 kB] Get: 42 http://deb.debian.org/debian trixie/main amd64 libfile-stripnondeterminism-perl all 1.14.0-1 [19.5 kB] Get: 43 http://deb.debian.org/debian trixie/main amd64 dh-strip-nondeterminism all 1.14.0-1 [8448 B] Get: 44 http://deb.debian.org/debian trixie/main amd64 libelf1t64 amd64 0.192-4 [189 kB] Get: 45 http://deb.debian.org/debian trixie/main amd64 dwz amd64 0.15-1+b1 [110 kB] Get: 46 http://deb.debian.org/debian trixie/main amd64 libicu72 amd64 72.1-5+b1 [9423 kB] Get: 47 http://deb.debian.org/debian trixie/main amd64 libxml2 amd64 2.12.7+dfsg+really2.9.14-0.2+b1 [699 kB] Get: 48 http://deb.debian.org/debian trixie/main amd64 gettext amd64 0.22.5-2 [1601 kB] Get: 49 http://deb.debian.org/debian trixie/main amd64 intltool-debian all 0.35.0+20060710.6 [22.9 kB] Get: 50 http://deb.debian.org/debian trixie/main amd64 po-debconf all 1.0.21+nmu1 [248 kB] Get: 51 http://deb.debian.org/debian trixie/main amd64 debhelper all 13.20 [915 kB] Get: 52 http://deb.debian.org/debian trixie/main amd64 python3-autocommand all 2.2.2-3 [13.6 kB] Get: 53 http://deb.debian.org/debian trixie/main amd64 python3-more-itertools all 10.5.0-1 [63.8 kB] Get: 54 http://deb.debian.org/debian trixie/main amd64 python3-typing-extensions all 4.12.2-2 [73.0 kB] Get: 55 http://deb.debian.org/debian trixie/main amd64 python3-typeguard all 4.4.1-1 [37.0 kB] Get: 56 http://deb.debian.org/debian trixie/main amd64 python3-inflect all 7.3.1-2 [32.4 kB] Get: 57 http://deb.debian.org/debian trixie/main amd64 python3-jaraco.context all 6.0.0-1 [7984 B] Get: 58 http://deb.debian.org/debian trixie/main amd64 python3-jaraco.functools all 4.1.0-1 [12.0 kB] Get: 59 http://deb.debian.org/debian trixie/main amd64 python3-pkg-resources all 75.2.0-1 [213 kB] Get: 60 http://deb.debian.org/debian trixie/main amd64 python3-jaraco.text all 4.0.0-1 [11.4 kB] Get: 61 http://deb.debian.org/debian trixie/main amd64 python3-zipp all 3.21.0-1 [10.6 kB] Get: 62 http://deb.debian.org/debian trixie/main amd64 python3-setuptools all 75.2.0-1 [731 kB] Get: 63 http://deb.debian.org/debian trixie/main amd64 dh-python all 6.20241024 [109 kB] Get: 64 http://deb.debian.org/debian trixie/main amd64 python3-all amd64 3.12.6-1 [1040 B] Get: 65 http://deb.debian.org/debian trixie/main amd64 python3-iniconfig all 1.1.1-2 [6396 B] Get: 66 http://deb.debian.org/debian trixie/main amd64 python3-packaging all 24.2-1 [55.3 kB] Get: 67 http://deb.debian.org/debian trixie/main amd64 python3-pluggy all 1.5.0-1 [26.9 kB] Get: 68 http://deb.debian.org/debian trixie/main amd64 python3-pyparsing all 3.1.2-1 [146 kB] Get: 69 http://deb.debian.org/debian trixie/main amd64 python3-pytest all 8.3.3-1 [249 kB] Get: 70 http://deb.debian.org/debian trixie/main amd64 python3-rdflib all 7.1.1-2 [472 kB] Fetched 29.2 MB in 1s (27.5 MB/s) debconf: delaying package configuration, since apt-utils is not installed Selecting previously unselected package libpython3.12-minimal:amd64. (Reading database ... 19959 files and directories currently installed.) Preparing to unpack .../libpython3.12-minimal_3.12.8-3_amd64.deb ... Unpacking libpython3.12-minimal:amd64 (3.12.8-3) ... Selecting previously unselected package libexpat1:amd64. Preparing to unpack .../libexpat1_2.6.4-1_amd64.deb ... Unpacking libexpat1:amd64 (2.6.4-1) ... Selecting previously unselected package python3.12-minimal. Preparing to unpack .../python3.12-minimal_3.12.8-3_amd64.deb ... Unpacking python3.12-minimal (3.12.8-3) ... Setting up libpython3.12-minimal:amd64 (3.12.8-3) ... Setting up libexpat1:amd64 (2.6.4-1) ... Setting up python3.12-minimal (3.12.8-3) ... Selecting previously unselected package python3-minimal. (Reading database ... 20279 files and directories currently installed.) Preparing to unpack .../00-python3-minimal_3.12.6-1_amd64.deb ... Unpacking python3-minimal (3.12.6-1) ... Selecting previously unselected package media-types. Preparing to unpack .../01-media-types_10.1.0_all.deb ... Unpacking media-types (10.1.0) ... Selecting previously unselected package netbase. Preparing to unpack .../02-netbase_6.4_all.deb ... Unpacking netbase (6.4) ... Selecting previously unselected package tzdata. Preparing to unpack .../03-tzdata_2024b-4_all.deb ... Unpacking tzdata (2024b-4) ... Selecting previously unselected package libkrb5support0:amd64. Preparing to unpack .../04-libkrb5support0_1.21.3-3_amd64.deb ... Unpacking libkrb5support0:amd64 (1.21.3-3) ... Selecting previously unselected package libcom-err2:amd64. Preparing to unpack .../05-libcom-err2_1.47.2~rc1-2_amd64.deb ... Unpacking libcom-err2:amd64 (1.47.2~rc1-2) ... Selecting previously unselected package libk5crypto3:amd64. Preparing to unpack .../06-libk5crypto3_1.21.3-3_amd64.deb ... Unpacking libk5crypto3:amd64 (1.21.3-3) ... Selecting previously unselected package libkeyutils1:amd64. Preparing to unpack .../07-libkeyutils1_1.6.3-4_amd64.deb ... Unpacking libkeyutils1:amd64 (1.6.3-4) ... Selecting previously unselected package libkrb5-3:amd64. Preparing to unpack .../08-libkrb5-3_1.21.3-3_amd64.deb ... Unpacking libkrb5-3:amd64 (1.21.3-3) ... Selecting previously unselected package libgssapi-krb5-2:amd64. Preparing to unpack .../09-libgssapi-krb5-2_1.21.3-3_amd64.deb ... Unpacking libgssapi-krb5-2:amd64 (1.21.3-3) ... Selecting previously unselected package libtirpc-common. Preparing to unpack .../10-libtirpc-common_1.3.4+ds-1.3_all.deb ... Unpacking libtirpc-common (1.3.4+ds-1.3) ... Selecting previously unselected package libtirpc3t64:amd64. Preparing to unpack .../11-libtirpc3t64_1.3.4+ds-1.3+b1_amd64.deb ... 
Adding 'diversion of /lib/x86_64-linux-gnu/libtirpc.so.3 to /lib/x86_64-linux-gnu/libtirpc.so.3.usr-is-merged by libtirpc3t64' Adding 'diversion of /lib/x86_64-linux-gnu/libtirpc.so.3.0.0 to /lib/x86_64-linux-gnu/libtirpc.so.3.0.0.usr-is-merged by libtirpc3t64' Unpacking libtirpc3t64:amd64 (1.3.4+ds-1.3+b1) ... Selecting previously unselected package libnsl2:amd64. Preparing to unpack .../12-libnsl2_1.3.0-3+b3_amd64.deb ... Unpacking libnsl2:amd64 (1.3.0-3+b3) ... Selecting previously unselected package readline-common. Preparing to unpack .../13-readline-common_8.2-6_all.deb ... Unpacking readline-common (8.2-6) ... Selecting previously unselected package libreadline8t64:amd64. Preparing to unpack .../14-libreadline8t64_8.2-6_amd64.deb ... Adding 'diversion of /lib/x86_64-linux-gnu/libhistory.so.8 to /lib/x86_64-linux-gnu/libhistory.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/x86_64-linux-gnu/libhistory.so.8.2 to /lib/x86_64-linux-gnu/libhistory.so.8.2.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/x86_64-linux-gnu/libreadline.so.8 to /lib/x86_64-linux-gnu/libreadline.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/x86_64-linux-gnu/libreadline.so.8.2 to /lib/x86_64-linux-gnu/libreadline.so.8.2.usr-is-merged by libreadline8t64' Unpacking libreadline8t64:amd64 (8.2-6) ... Selecting previously unselected package libpython3.12-stdlib:amd64. Preparing to unpack .../15-libpython3.12-stdlib_3.12.8-3_amd64.deb ... Unpacking libpython3.12-stdlib:amd64 (3.12.8-3) ... Selecting previously unselected package python3.12. Preparing to unpack .../16-python3.12_3.12.8-3_amd64.deb ... Unpacking python3.12 (3.12.8-3) ... Selecting previously unselected package libpython3-stdlib:amd64. Preparing to unpack .../17-libpython3-stdlib_3.12.6-1_amd64.deb ... Unpacking libpython3-stdlib:amd64 (3.12.6-1) ... Setting up python3-minimal (3.12.6-1) ... Selecting previously unselected package python3. (Reading database ... 21342 files and directories currently installed.) Preparing to unpack .../00-python3_3.12.6-1_amd64.deb ... Unpacking python3 (3.12.6-1) ... Selecting previously unselected package sensible-utils. Preparing to unpack .../01-sensible-utils_0.0.24_all.deb ... Unpacking sensible-utils (0.0.24) ... Selecting previously unselected package libmagic-mgc. Preparing to unpack .../02-libmagic-mgc_1%3a5.45-3+b1_amd64.deb ... Unpacking libmagic-mgc (1:5.45-3+b1) ... Selecting previously unselected package libmagic1t64:amd64. Preparing to unpack .../03-libmagic1t64_1%3a5.45-3+b1_amd64.deb ... Unpacking libmagic1t64:amd64 (1:5.45-3+b1) ... Selecting previously unselected package file. Preparing to unpack .../04-file_1%3a5.45-3+b1_amd64.deb ... Unpacking file (1:5.45-3+b1) ... Selecting previously unselected package gettext-base. Preparing to unpack .../05-gettext-base_0.22.5-2_amd64.deb ... Unpacking gettext-base (0.22.5-2) ... Selecting previously unselected package libuchardet0:amd64. 
Preparing to unpack .../06-libuchardet0_0.0.8-1+b2_amd64.deb ... Unpacking libuchardet0:amd64 (0.0.8-1+b2) ... Selecting previously unselected package groff-base. Preparing to unpack .../07-groff-base_1.23.0-6_amd64.deb ... Unpacking groff-base (1.23.0-6) ... Selecting previously unselected package bsdextrautils. Preparing to unpack .../08-bsdextrautils_2.40.2-12_amd64.deb ... Unpacking bsdextrautils (2.40.2-12) ... Selecting previously unselected package libpipeline1:amd64. Preparing to unpack .../09-libpipeline1_1.5.8-1_amd64.deb ... Unpacking libpipeline1:amd64 (1.5.8-1) ... Selecting previously unselected package man-db. Preparing to unpack .../10-man-db_2.13.0-1_amd64.deb ... Unpacking man-db (2.13.0-1) ... Selecting previously unselected package m4. Preparing to unpack .../11-m4_1.4.19-4_amd64.deb ... Unpacking m4 (1.4.19-4) ... Selecting previously unselected package autoconf. Preparing to unpack .../12-autoconf_2.72-3_all.deb ... Unpacking autoconf (2.72-3) ... Selecting previously unselected package autotools-dev. Preparing to unpack .../13-autotools-dev_20220109.1_all.deb ... Unpacking autotools-dev (20220109.1) ... Selecting previously unselected package automake. Preparing to unpack .../14-automake_1%3a1.16.5-1.3_all.deb ... Unpacking automake (1:1.16.5-1.3) ... Selecting previously unselected package autopoint. Preparing to unpack .../15-autopoint_0.22.5-2_all.deb ... Unpacking autopoint (0.22.5-2) ... Selecting previously unselected package libdebhelper-perl. Preparing to unpack .../16-libdebhelper-perl_13.20_all.deb ... Unpacking libdebhelper-perl (13.20) ... Selecting previously unselected package libtool. Preparing to unpack .../17-libtool_2.4.7-8_all.deb ... Unpacking libtool (2.4.7-8) ... Selecting previously unselected package dh-autoreconf. Preparing to unpack .../18-dh-autoreconf_20_all.deb ... Unpacking dh-autoreconf (20) ... Selecting previously unselected package libarchive-zip-perl. Preparing to unpack .../19-libarchive-zip-perl_1.68-1_all.deb ... Unpacking libarchive-zip-perl (1.68-1) ... Selecting previously unselected package libfile-stripnondeterminism-perl. Preparing to unpack .../20-libfile-stripnondeterminism-perl_1.14.0-1_all.deb ... Unpacking libfile-stripnondeterminism-perl (1.14.0-1) ... Selecting previously unselected package dh-strip-nondeterminism. Preparing to unpack .../21-dh-strip-nondeterminism_1.14.0-1_all.deb ... Unpacking dh-strip-nondeterminism (1.14.0-1) ... Selecting previously unselected package libelf1t64:amd64. Preparing to unpack .../22-libelf1t64_0.192-4_amd64.deb ... Unpacking libelf1t64:amd64 (0.192-4) ... Selecting previously unselected package dwz. Preparing to unpack .../23-dwz_0.15-1+b1_amd64.deb ... Unpacking dwz (0.15-1+b1) ... Selecting previously unselected package libicu72:amd64. Preparing to unpack .../24-libicu72_72.1-5+b1_amd64.deb ... Unpacking libicu72:amd64 (72.1-5+b1) ... Selecting previously unselected package libxml2:amd64. Preparing to unpack .../25-libxml2_2.12.7+dfsg+really2.9.14-0.2+b1_amd64.deb ... Unpacking libxml2:amd64 (2.12.7+dfsg+really2.9.14-0.2+b1) ... Selecting previously unselected package gettext. Preparing to unpack .../26-gettext_0.22.5-2_amd64.deb ... Unpacking gettext (0.22.5-2) ... Selecting previously unselected package intltool-debian. Preparing to unpack .../27-intltool-debian_0.35.0+20060710.6_all.deb ... Unpacking intltool-debian (0.35.0+20060710.6) ... Selecting previously unselected package po-debconf. Preparing to unpack .../28-po-debconf_1.0.21+nmu1_all.deb ... 
Unpacking po-debconf (1.0.21+nmu1) ... Selecting previously unselected package debhelper. Preparing to unpack .../29-debhelper_13.20_all.deb ... Unpacking debhelper (13.20) ... Selecting previously unselected package python3-autocommand. Preparing to unpack .../30-python3-autocommand_2.2.2-3_all.deb ... Unpacking python3-autocommand (2.2.2-3) ... Selecting previously unselected package python3-more-itertools. Preparing to unpack .../31-python3-more-itertools_10.5.0-1_all.deb ... Unpacking python3-more-itertools (10.5.0-1) ... Selecting previously unselected package python3-typing-extensions. Preparing to unpack .../32-python3-typing-extensions_4.12.2-2_all.deb ... Unpacking python3-typing-extensions (4.12.2-2) ... Selecting previously unselected package python3-typeguard. Preparing to unpack .../33-python3-typeguard_4.4.1-1_all.deb ... Unpacking python3-typeguard (4.4.1-1) ... Selecting previously unselected package python3-inflect. Preparing to unpack .../34-python3-inflect_7.3.1-2_all.deb ... Unpacking python3-inflect (7.3.1-2) ... Selecting previously unselected package python3-jaraco.context. Preparing to unpack .../35-python3-jaraco.context_6.0.0-1_all.deb ... Unpacking python3-jaraco.context (6.0.0-1) ... Selecting previously unselected package python3-jaraco.functools. Preparing to unpack .../36-python3-jaraco.functools_4.1.0-1_all.deb ... Unpacking python3-jaraco.functools (4.1.0-1) ... Selecting previously unselected package python3-pkg-resources. Preparing to unpack .../37-python3-pkg-resources_75.2.0-1_all.deb ... Unpacking python3-pkg-resources (75.2.0-1) ... Selecting previously unselected package python3-jaraco.text. Preparing to unpack .../38-python3-jaraco.text_4.0.0-1_all.deb ... Unpacking python3-jaraco.text (4.0.0-1) ... Selecting previously unselected package python3-zipp. Preparing to unpack .../39-python3-zipp_3.21.0-1_all.deb ... Unpacking python3-zipp (3.21.0-1) ... Selecting previously unselected package python3-setuptools. Preparing to unpack .../40-python3-setuptools_75.2.0-1_all.deb ... Unpacking python3-setuptools (75.2.0-1) ... Selecting previously unselected package dh-python. Preparing to unpack .../41-dh-python_6.20241024_all.deb ... Unpacking dh-python (6.20241024) ... Selecting previously unselected package python3-all. Preparing to unpack .../42-python3-all_3.12.6-1_amd64.deb ... Unpacking python3-all (3.12.6-1) ... Selecting previously unselected package python3-iniconfig. Preparing to unpack .../43-python3-iniconfig_1.1.1-2_all.deb ... Unpacking python3-iniconfig (1.1.1-2) ... Selecting previously unselected package python3-packaging. Preparing to unpack .../44-python3-packaging_24.2-1_all.deb ... Unpacking python3-packaging (24.2-1) ... Selecting previously unselected package python3-pluggy. Preparing to unpack .../45-python3-pluggy_1.5.0-1_all.deb ... Unpacking python3-pluggy (1.5.0-1) ... Selecting previously unselected package python3-pyparsing. Preparing to unpack .../46-python3-pyparsing_3.1.2-1_all.deb ... Unpacking python3-pyparsing (3.1.2-1) ... Selecting previously unselected package python3-pytest. Preparing to unpack .../47-python3-pytest_8.3.3-1_all.deb ... Unpacking python3-pytest (8.3.3-1) ... Selecting previously unselected package python3-rdflib. Preparing to unpack .../48-python3-rdflib_7.1.1-2_all.deb ... Unpacking python3-rdflib (7.1.1-2) ... Setting up media-types (10.1.0) ... Setting up libpipeline1:amd64 (1.5.8-1) ... Setting up libkeyutils1:amd64 (1.6.3-4) ... Setting up libicu72:amd64 (72.1-5+b1) ... 
Setting up bsdextrautils (2.40.2-12) ... Setting up libmagic-mgc (1:5.45-3+b1) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libtirpc-common (1.3.4+ds-1.3) ... Setting up libdebhelper-perl (13.20) ... Setting up libmagic1t64:amd64 (1:5.45-3+b1) ... Setting up gettext-base (0.22.5-2) ... Setting up m4 (1.4.19-4) ... Setting up libcom-err2:amd64 (1.47.2~rc1-2) ... Setting up file (1:5.45-3+b1) ... Setting up libelf1t64:amd64 (0.192-4) ... Setting up libkrb5support0:amd64 (1.21.3-3) ... Setting up tzdata (2024b-4) ... Current default time zone: 'Etc/UTC' Local time is now: Tue Jan 20 23:38:15 UTC 2026. Universal Time is now: Tue Jan 20 23:38:15 UTC 2026. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up autotools-dev (20220109.1) ... Setting up autopoint (0.22.5-2) ... Setting up libk5crypto3:amd64 (1.21.3-3) ... Setting up autoconf (2.72-3) ... Setting up dwz (0.15-1+b1) ... Setting up sensible-utils (0.0.24) ... Setting up libuchardet0:amd64 (0.0.8-1+b2) ... Setting up netbase (6.4) ... Setting up libkrb5-3:amd64 (1.21.3-3) ... Setting up readline-common (8.2-6) ... Setting up libxml2:amd64 (2.12.7+dfsg+really2.9.14-0.2+b1) ... Setting up automake (1:1.16.5-1.3) ... update-alternatives: using /usr/bin/automake-1.16 to provide /usr/bin/automake (automake) in auto mode Setting up libfile-stripnondeterminism-perl (1.14.0-1) ... Setting up gettext (0.22.5-2) ... Setting up libtool (2.4.7-8) ... Setting up intltool-debian (0.35.0+20060710.6) ... Setting up dh-autoreconf (20) ... Setting up libgssapi-krb5-2:amd64 (1.21.3-3) ... Setting up libreadline8t64:amd64 (8.2-6) ... Setting up dh-strip-nondeterminism (1.14.0-1) ... Setting up groff-base (1.23.0-6) ... Setting up libtirpc3t64:amd64 (1.3.4+ds-1.3+b1) ... Setting up po-debconf (1.0.21+nmu1) ... Setting up man-db (2.13.0-1) ... Not building database; man-db/auto-update is not 'true'. Setting up libnsl2:amd64 (1.3.0-3+b3) ... Setting up libpython3.12-stdlib:amd64 (3.12.8-3) ... Setting up python3.12 (3.12.8-3) ... Setting up debhelper (13.20) ... Setting up libpython3-stdlib:amd64 (3.12.6-1) ... Setting up python3 (3.12.6-1) ... Setting up python3-zipp (3.21.0-1) ... Setting up python3-autocommand (2.2.2-3) ... Setting up python3-packaging (24.2-1) ... Setting up python3-pyparsing (3.1.2-1) ... Setting up python3-typing-extensions (4.12.2-2) ... Setting up python3-pluggy (1.5.0-1) ... Setting up python3-rdflib (7.1.1-2) ... Setting up python3-more-itertools (10.5.0-1) ... Setting up python3-iniconfig (1.1.1-2) ... Setting up python3-jaraco.functools (4.1.0-1) ... Setting up python3-jaraco.context (6.0.0-1) ... Setting up python3-pytest (8.3.3-1) ... Setting up python3-typeguard (4.4.1-1) ... Setting up python3-all (3.12.6-1) ... Setting up python3-inflect (7.3.1-2) ... Setting up python3-jaraco.text (4.0.0-1) ... Setting up python3-pkg-resources (75.2.0-1) ... Setting up python3-setuptools (75.2.0-1) ... Setting up dh-python (6.20241024) ... Processing triggers for libc-bin (2.40-4) ... Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initializing package states... Writing extended state information... Building tag database... 
-> Finished parsing the build-deps I: Building the package I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/A99_set_merged_usr starting Not re-configuring usrmerge for trixie I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/A99_set_merged_usr finished hostname: Name or service not known I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes dpkg-buildpackage: info: source package sparql-wrapper-python dpkg-buildpackage: info: source version 2.0.0-2 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Alexandre Detiste dpkg-source --before-build . dpkg-buildpackage: info: host architecture amd64 debian/rules clean dh clean --buildsystem=pybuild dh_auto_clean -O--buildsystem=pybuild I: pybuild base:311: python3.12 setup.py clean running clean removing '/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build' (and everything under it) 'build/bdist.linux-x86_64' does not exist -- can't clean it 'build/scripts-3.12' does not exist -- can't clean it dh_autoreconf_clean -O--buildsystem=pybuild dh_clean -O--buildsystem=pybuild debian/rules binary dh binary --buildsystem=pybuild dh_update_autotools_config -O--buildsystem=pybuild dh_autoreconf -O--buildsystem=pybuild dh_auto_configure -O--buildsystem=pybuild I: pybuild base:311: python3.12 setup.py config running config dh_auto_build -O--buildsystem=pybuild I: pybuild base:311: /usr/bin/python3 setup.py build running build running build_py creating /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build debian/rules override_dh_auto_test make[1]: Entering directory '/build/reproducible-path/sparql-wrapper-python-2.0.0' # tests need a remote server dh_auto_test || : I: pybuild base:311: cd 
/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build; python3.12 -m pytest test ============================= test session starts ============================== platform linux -- Python 3.12.8, pytest-8.3.3, pluggy-1.5.0 rootdir: /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build configfile: pyproject.toml plugins: typeguard-4.4.1 collected 1525 items test/test_agrovoc-allegrograph_on_hold.py sFxxsFFsFFxsFFxxsFFFFxxsFFFFxx [ 1%] sFFFFxxsFFFFFFFFssFFFxxFFxFFxxFFF [ 4%] test/test_allegrograph__v4_14_1__mmi.py ssFFFFFFssFFFFssFFFFFFssFFFFFFss [ 6%] FFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFFFssFFFFFF [ 10%] FFFFFFFFFFFFFFFFFFFFFFF [ 12%] test/test_blazegraph__wikidata.py ssFFFFFFssFFFFssFFFFFFssFFFFFFsFsFsFFF [ 14%] sFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFFsFsFFFFFFFsFF [ 19%] FFFsFFFFFFFsFFFFF [ 20%] test/test_cli.py ..F...FFFFFFFFFFFFFFFFFFFFFF [ 22%] test/test_fuseki2__v3_6_0__agrovoc.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFF [ 24%] sFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFs [ 29%] FFFFFFFFFFsFFsFFFFFFF [ 30%] test/test_fuseki2__v3_8_0__stw.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFFsFsF [ 33%] sFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFsFFFF [ 38%] FFFFFFsFFsFFFFFFF [ 39%] test/test_graphdbEnterprise__v8_9_0__rs.py ssssFFsFsssFsFssssFFsFsssFsFs [ 41%] FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFsFFsFsF [ 45%] ssFFsFsFsFsFsFssFFsFsFsFsF [ 47%] test/test_lov-fuseki_on_hold.py FFFFFFFFFFFFFFssssssssssssssFFFFFFFFFFFF [ 50%] FFFFssssssssssssssssFFFFFFFFFFFFFFFFssssssssssssssssFsFFssFFFFFFFFFFFFFF [ 54%] Fssssssssssssss [ 55%] test/test_rdf4j__geosciml.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 58%] FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFssFFsFsFssFFsFsFsFsFs [ 63%] FssFFsFsFsFsF [ 64%] test/test_stardog__lindas.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 67%] FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFssFFFsFsFssFFsFsFsFsFs [ 71%] FssFFsFsFsFsF [ 72%] test/test_store__v1_1_4.py FFFsFFsFsFxFxFxxxxxxxxxxxxxxsFsssFsFsFsFxFxFx [ 75%] xssxxxxxxxxxxxxsFsssFsssFssxFxFxxssxxxxxxxxxxxxFFFFssFFFFsFFsFsFxFxFxxxx [ 80%] xxxxxxxxxx [ 81%] test/test_virtuoso__v7_20_3230__dbpedia.py FFFssFssFFFFFFsssssFsssssssss [ 82%] FFFssFFFFFFFFFFsFssssFssssssFsssFFFssFFFFFFFFFFssssssssssssssssFFFFssFFF [ 87%] FFFFssFFFFFFsssFFsssssssss [ 89%] test/test_virtuoso__v8_03_3313__dbpedia.py FFFssFssFFFFFFsssssssssssssss [ 91%] FFFssFFFFFFFFFFsssssssssssssssssFFFssFFFFFFFFFFssssssssssssssssFFFFFsFFF [ 96%] FFFFssFFFFFFssssssssssssss [ 97%] test/test_wrapper.py ....s..........................F... [100%] =================================== FAILURES =================================== ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) test/test_agrovoc-allegrograph_on_hold.py:403: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow(self): > result = self.__generic(askQuery, "foo", GET) test/test_agrovoc-allegrograph_on_hold.py:459: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:345: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_agrovoc-allegrograph_on_hold.py:410: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinJSONLD ___________________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSONLD(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_agrovoc-allegrograph_on_hold.py:451: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
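The do_open() tunneling branch quoted in every frame above title-cases the header names, then moves Proxy-Authorization out of the origin-request headers and into the CONNECT headers, so the proxy credential is never sent to the origin server. Isolated into a standalone sketch (split_tunnel_headers is our name, not urllib's):

    def split_tunnel_headers(headers):
        # Title-case names as do_open() does, then pull the proxy
        # credential out so it travels only on the CONNECT request.
        headers = {name.title(): val for name, val in headers.items()}
        tunnel_headers = {}
        if "Proxy-Authorization" in headers:
            tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")
        return headers, tunnel_headers

    origin, tunnel = split_tunnel_headers(
        {"accept": "application/rdf+xml", "proxy-authorization": "Basic Zm9vOmJhcg=="}
    )
    print(origin)   # {'Accept': 'application/rdf+xml'}
    print(tunnel)   # {'Proxy-Authorization': 'Basic Zm9vOmJhcg=='}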
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:469: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:354: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
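None of these failures indicate a bug in SPARQLWrapper itself; the endpoint is simply unreachable from this environment. A hypothetical guard (not part of this test suite) that would skip endpoint tests when nothing answers, instead of failing them:

    import socket

    import pytest

    def endpoint_reachable(host, port, timeout=1.0):
        # True only if a TCP connection to (host, port) succeeds.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_endpoint = pytest.mark.skipif(
        not endpoint_reachable("127.0.0.1", 9),
        reason="SPARQL endpoint unreachable in this environment",
    )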
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_agrovoc-allegrograph_on_hold.py:513: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_agrovoc-allegrograph_on_hold.py:499: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
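The socket.create_connection() source quoted in each traceback reduces to: try every (family, type, proto) tuple that getaddrinfo() returns, remember each failure, and re-raise the last one if nothing connects, which is how the single ConnectionRefusedError surfaces here. A condensed, runnable paraphrase (connect_any is our name, not the stdlib's):

    import socket

    def connect_any(host, port, timeout=None):
        last_err = None
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = socket.socket(af, socktype, proto)
            if timeout is not None:
                sock.settimeout(timeout)
            try:
                sock.connect(sa)       # refused here for ('127.0.0.1', 9)
                return sock
            except OSError as err:
                sock.close()
                last_err = err         # keep the most recent failure
        raise last_err or OSError("getaddrinfo returned no results")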
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:593: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to testAskByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:485: 
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urlopen and do_open frames identical to testAskByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3(self): > result = self.__generic(constructQuery, N3, POST) test/test_agrovoc-allegrograph_on_hold.py:520: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
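# A rough, hand-rolled equivalent (not the stdlib code) of what
# do_open() above ends up doing for these tests: open an
# HTTPSConnection, force "Connection: close", send a single request.
# The path "/sparql?query=..." is a placeholder, not the suite's URL.
import http.client
import ssl

conn = http.client.HTTPSConnection("agrovoc.fao.org", timeout=10,
                                   context=ssl.create_default_context())
try:
    conn.request("GET", "/sparql?query=...",
                 headers={"Accept": "application/rdf+xml",
                          "Connection": "close"})
    resp = conn.getresponse()
    print(resp.status, resp.reason)
except OSError as err:            # connection failures surface as OSError
    print("request failed:", err)
finally:
    conn.close()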
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, POST) test/test_agrovoc-allegrograph_on_hold.py:506: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
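# The loop quoted from socket.create_connection() above boils down to
# this sketch: try every address getaddrinfo() returns and re-raise the
# last failure if none of them accepts the connection.
import socket

def connect_any(host, port, timeout=10):
    last_err = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
            host, port, 0, socket.SOCK_STREAM):
        s = socket.socket(af, socktype, proto)
        s.settimeout(timeout)
        try:
            s.connect(sa)
            return s                   # first address that answers wins
        except OSError as err:
            last_err = err
            s.close()
    raise last_err or OSError("getaddrinfo returned no usable address")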
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow(self): > result = self.__generic(constructQuery, "bar", POST) test/test_agrovoc-allegrograph_on_hold.py:601: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
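# What a call like self.__generic(constructQuery, XML, POST) presumably
# boils down to; a hedged sketch of SPARQLWrapper 2.0.0 usage. The
# endpoint URL and query text here are illustrative assumptions, not
# the suite's actual values.
from SPARQLWrapper import SPARQLWrapper, XML, POST

sparql = SPARQLWrapper("https://agrovoc.fao.org/sparql")
sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(POST)
sparql.setReturnFormat(XML)
result = sparql.query()       # raises urllib.error.URLError when offline
print(result.convert())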
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML(self): > result = self.__generic(constructQuery, XML, POST) test/test_agrovoc-allegrograph_on_hold.py:492: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
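# One way (a suggestion, not something this package does) to keep such
# suites from erroring in a network-less chroot: probe the endpoint
# once and skip the whole module when it is unreachable.
import socket

import pytest

def endpoint_reachable(host, port, timeout=3):
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

pytestmark = pytest.mark.skipif(
    not endpoint_reachable("agrovoc.fao.org", 443),
    reason="SPARQL endpoint unreachable (network disabled during build)")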
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3(self): > result = self.__generic(describeQuery, N3, GET) test/test_agrovoc-allegrograph_on_hold.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
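# The long "Accept" value above is SPARQLWrapper's content negotiation
# for N3; a plain-urllib equivalent of the GET-in-N3 request looks like
# this (the query string is elided, as in the log):
import urllib.error
import urllib.request

req = urllib.request.Request(
    "https://agrovoc.fao.org/sparql?query=...",
    headers={"Accept": "application/turtle,text/turtle,text/rdf+n3,"
                       "application/n-triples,application/n3,text/n3"})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.headers.get_content_type())
except urllib.error.URLError as err:   # wraps the underlying OSError
    print("failed:", err.reason)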
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) test/test_agrovoc-allegrograph_on_hold.py:629: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
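# do_open() (quoted above) converts any OSError from the socket layer
# into urllib.error.URLError; the original exception is kept as
# .reason, which is how the ConnectionRefusedError ends up wrapped.
import urllib.error

try:
    raise urllib.error.URLError(
        ConnectionRefusedError(111, "Connection refused"))
except urllib.error.URLError as err:
    print(type(err.reason).__name__, err.reason)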
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) test/test_agrovoc-allegrograph_on_hold.py:724: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
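# The suite spells out every (query, format, method) combination as a
# separate testXByYinZ method; with pytest the same matrix could be
# parametrized instead (illustrative only, not the package's layout):
import pytest

@pytest.mark.parametrize("fmt", ["xml", "rdf+xml", "n3", "foo"])
@pytest.mark.parametrize("method", ["GET", "POST"])
def test_describe(fmt, method):
    ...   # would delegate to the shared __generic(describeQuery, fmt, method)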
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:615: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
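# The connect target '127.0.0.1:9' alongside 'Host: agrovoc.fao.org'
# suggests the build sets an http(s) proxy pointing at the discard
# port, the usual pbuilder trick for disabling network access (an
# assumption from the log, not a documented fact); urllib honours such
# a proxy from the environment:
import os
import urllib.error
import urllib.request

os.environ["https_proxy"] = "http://127.0.0.1:9"   # assumed build setting
try:
    urllib.request.urlopen("https://agrovoc.fao.org/sparql", timeout=5)
except urllib.error.URLError as err:
    print("proxied connect refused:", err.reason)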
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3(self): > result = self.__generic(describeQuery, N3, POST) test/test_agrovoc-allegrograph_on_hold.py:650: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
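# The POST variants carry the query urlencoded in the request body,
# which is where the Content-Length values above come from (490 for the
# CONSTRUCT tests, 289 for the DESCRIBE ones). The query text below is
# an assumption for illustration:
from urllib.parse import urlencode

describe_query = "DESCRIBE <http://example.org/resource>"
body = urlencode({"query": describe_query}).encode()
print(len(body), {"Content-Type": "application/x-www-form-urlencoded",
                  "Content-Length": str(len(body))})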
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_agrovoc-allegrograph_on_hold.py:636:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
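As the chained traceback shows, urllib re-raises the low-level OSError as a URLError; callers can still reach the original error through its .reason attribute. A small self-contained sketch:

    # urllib wraps the socket error; .reason carries the original OSError.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        print(err.reason)   # [Errno 111] Connection refused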
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:732:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
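Note that the unknown format "bar" was sent with the same Accept header as the RDF/XML tests, which suggests the wrapper falls back to a default media type for unrecognized formats. An illustrative sketch of that mapping; the names and the fallback rule are assumptions, not SPARQLWrapper internals:

    # Hypothetical Accept-header lookup with a default fallback.
    _DEFAULT_ACCEPT = "application/rdf+xml"
    _ACCEPT_BY_FORMAT = {
        "xml": "application/rdf+xml",
        "n3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
    }

    def accept_for(fmt: str) -> str:
        return _ACCEPT_BY_FORMAT.get(fmt, _DEFAULT_ACCEPT)

    print(accept_for("bar"))   # application/rdf+xml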
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
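The do_open listing shown in full earlier title-cases header names and forces Connection: close so the single-request addinfourl wrapper never blocks on a kept-alive socket. A standalone sketch of just that normalization step:

    # The same dict-based normalization do_open applies before sending.
    headers = {"accept": "application/rdf+xml",
               "content-type": "application/x-www-form-urlencoded"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'application/rdf+xml',
    #  'Content-Type': 'application/x-www-form-urlencoded',
    #  'Connection': 'close'}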
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_agrovoc-allegrograph_on_hold.py:757:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
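The testKeepAlive body above is the plainest view of the API under test. Restated against a placeholder endpoint (https://example.org/sparql is illustrative, not the test's endpoint, and setUseKeepAlive() needs the optional keepalive package at runtime):

    # The same call sequence as testKeepAlive, against a reachable endpoint.
    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://example.org/sparql")
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()
    results = sparql.query().convert()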
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:742:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
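testQueryBadFormed uses unittest's callable form of assertRaises: the test passes only if invoking self.__generic(...) raises QueryBadFormed; here it fails because URLError arrives instead. The same pattern with stand-in names:

    # Callable-form assertRaises: the exception type comes first, then the
    # callable and its arguments.
    import unittest

    class Demo(unittest.TestCase):
        def test_raises(self):
            def bad_query():
                raise ValueError("bad formed")
            self.assertRaises(ValueError, bad_query)

    if __name__ == "__main__":
        unittest.main()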
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:748:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
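The create_connection docstring quoted in the first traceback documents the all_errors flag (Python 3.11+): with it, every per-address failure is reported as an ExceptionGroup instead of a single error. A sketch, again assuming nothing listens on 127.0.0.1:9:

    # all_errors=True collects one error per resolved address into an
    # ExceptionGroup; except* (3.11+) matches members of the group.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)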
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:745:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
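The tunnelling branch of do_open forwards Proxy-Authorization only on the CONNECT request to the proxy, never to the origin server. A sketch of the same mechanism used directly; proxy.example:3128 and the credential are placeholders:

    # HTTPS through a proxy: set_tunnel() carries the CONNECT-only headers.
    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
    conn.set_tunnel("agrovoc.fao.org", 443,
                    headers={"Proxy-Authorization": "Basic ..."})
    # conn.request("GET", "/sparql")  # would CONNECT through the proxy first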
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:769:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
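The body of create_connection quoted earlier loops over getaddrinfo results, tries each resolved address in turn, and re-raises a stored error only if every attempt fails. The same pattern as a self-contained helper (names are illustrative):

    # Resolve-then-try-each-address, as in socket.create_connection.
    import socket

    def connect_any(host, port, timeout=1.0):
        errors = []
        for af, socktype, proto, _name, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock          # first address that works wins
            except OSError as err:
                if sock is not None:
                    sock.close()
                errors.append(err)
        raise errors[0]              # all addresses failed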
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

[same refused-connection traceback as above, for a GET request with
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_agrovoc-allegrograph_on_hold.py:260
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
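The same docstring also describes the keyword-only *all_errors* flag: with
all_errors=True, create_connection() raises an ExceptionGroup collecting the
failure from every address it tried, instead of re-raising only the last one.
A sketch under that reading, using the except* syntax available in the Python
3.12 used here; host and port are placeholders:

import socket

# Sketch: all_errors=True turns a total connection failure into an
# ExceptionGroup with one OSError per attempted address.
try:
    sock = socket.create_connection(("localhost", 9), timeout=2.0,
                                    all_errors=True)
except* ConnectionRefusedError as group:
    # group.exceptions holds each refused attempt (e.g. IPv4 and IPv6).
    for exc in group.exceptions:
        print("attempt failed:", exc)
else:
    sock.close()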
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

[same refused-connection traceback as above, for a GET request with
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_agrovoc-allegrograph_on_hold.py:246
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
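The comment block inside do_open() above explains why urllib forces
"Connection: close": the addinfourl response object cannot manage a persistent
connection, so reading to end-of-file would otherwise block while the server
waited for a second request. A sketch of the same one-shot pattern written
directly against http.client; the endpoint is a placeholder:

import http.client

# Sketch of the one-request-per-connection pattern that do_open() enforces.
conn = http.client.HTTPSConnection("example.org", timeout=10)
try:
    # "Connection: close" asks the server to end the connection after this
    # response, so read() can safely consume the body to EOF.
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    body = resp.read()
    print(resp.status, len(body))
finally:
    conn.close()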
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

[same refused-connection traceback as above, for a GET request with
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:319
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
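do_open() above also contains the CONNECT-tunnel branch: when the request
carries a tunnel host, Proxy-Authorization is moved onto the tunnel headers so
that it reaches only the proxy and never the origin server. A sketch of the
same split done by hand with http.client; the proxy address, origin host and
credentials are all placeholders:

import base64
import http.client

token = base64.b64encode(b"user:secret").decode("ascii")

# The TCP connection goes to the proxy...
conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=10)
# ...and set_tunnel() names the origin; the Proxy-Authorization header
# travels only in the CONNECT request, as in the do_open() branch above.
conn.set_tunnel("example.org", 443,
                headers={"Proxy-Authorization": "Basic " + token})
conn.request("GET", "/", headers={"Connection": "close"})
print(conn.getresponse().status)
conn.close()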
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

[same refused-connection traceback as above, for a POST request with
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_agrovoc-allegrograph_on_hold.py:239
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
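Every one of these failures funnels through sparql.query() in
SPARQLWrapper/Wrapper.py, which builds a urllib request whose Accept header
follows the requested return format (application/sparql-results+xml, text/csv,
text/tab-separated-values, and so on, as the headers dumps show). A sketch of
the client-side usage the tests appear to exercise; the endpoint URL and query
text are placeholders, and the body of the tests' __generic() helper is not
shown in this log:

from SPARQLWrapper import SPARQLWrapper, JSON, POST

# Sketch of the query path driven by the tests above. The endpoint and
# query string are placeholders; the real tests target agrovoc.fao.org
# and mmisw.org.
sparql = SPARQLWrapper(
    "https://example.org/sparql",
    agent="sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
)
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
sparql.setReturnFormat(JSON)   # chooses the Accept header seen in the dumps
sparql.setMethod(POST)         # POST sends the query as a form-encoded body
result = sparql.query()        # -> urllib.request.urlopen() under the hood
print(result.convert())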
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

[same refused-connection traceback as above, for a POST request with
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_agrovoc-allegrograph_on_hold.py:267
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

[same refused-connection traceback as above, for a POST request with
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)

test/test_agrovoc-allegrograph_on_hold.py:253
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
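As the re-raise above shows, any OSError escaping the socket layer surfaces
from urlopen() wrapped in urllib.error.URLError, with the original exception
preserved in .reason. A sketch of unwrapping it; the URL mirrors the
unreachable 127.0.0.1:9 address used during this build:

import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=2)
except urllib.error.URLError as err:
    # err.reason carries the underlying OSError, here ConnectionRefusedError.
    if isinstance(err.reason, ConnectionRefusedError):
        print("endpoint refused the connection:", err.reason)
    else:
        raise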
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

[same refused-connection traceback as above, for a POST request with
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:329
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

[same refused-connection traceback as above, for a POST request with
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:224
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
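The POST variants differ from the GET ones only in carrying a form-encoded
body, which is why their header dumps gain Content-Type:
application/x-www-form-urlencoded and a Content-Length. A sketch of the
corresponding urllib request construction; the endpoint and query are
placeholders:

import urllib.parse
import urllib.request

# Sketch: a form-encoded POST like the ones in the failures above.
data = urllib.parse.urlencode(
    {"query": "SELECT * WHERE { ?s ?p ?o }"}).encode("ascii")
req = urllib.request.Request("https://example.org/sparql", data=data,
                             headers={"Accept": "text/csv"})
# A data payload switches the method to POST; when the request is opened,
# urllib fills in Content-Type (form-urlencoded by default) and
# Content-Length for it.
print(req.get_method())   # -> 'POST'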
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML(self): > result = self.__generic(selectQuery, XML, POST) test/test_agrovoc-allegrograph_on_hold.py:224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
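The chain through SPARQLWrapper/Wrapper.py (query() to _query() to urlopener(request)) is the wrapper's normal request path. A hedged sketch of what each failing test effectively does; the endpoint URL and query string here are placeholders, the real ones live in the test modules:

from urllib.error import URLError

from SPARQLWrapper import GET, JSON, SPARQLWrapper

sparql = SPARQLWrapper("https://mmisw.org/sparql")  # placeholder endpoint URL
sparql.setQuery("ASK WHERE { ?s ?p ?o }")           # stands in for askQuery
sparql.setMethod(GET)
sparql.setReturnFormat(JSON)

try:
    # query() builds a urllib request and calls urlopen(); with the endpoint
    # unreachable it propagates the URLError recorded in these tracebacks.
    print(sparql.query().convert())
except URLError as err:
    print(err.reason)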
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_allegrograph__v4_14_1__mmi.py:647:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:658:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:579:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:603:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
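create_connection()'s docstring, quoted in full in the first traceback, describes the error handling these tests trip over: each resolved address is tried in turn, and on total failure either the last error is raised or, with all_errors=True (Python 3.11+), an ExceptionGroup of all of them. A small sketch against the same closed port:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print(err)               # [Errno 111] Connection refused (the "last error")

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
except* ConnectionRefusedError as group:
    print(group.exceptions)  # the refusals for every address tried, grouped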
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:689:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:698:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
[The following failures are duplicates of the traceback shown in full above:
socket.connect() to ('127.0.0.1', 9) raises ConnectionRefusedError
[Errno 111], which /usr/lib/python3.12/urllib/request.py:1347 re-raises as
urllib.error.URLError. Only the Accept, Content-Type and Content-Length
request headers vary per test; the repeated do_open() and create_connection()
listings are elided, keeping each failing test and its source line.]

_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:460

_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:468

____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:586

_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_allegrograph__v4_14_1__mmi.py:669

__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:680

________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:593

_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:625

____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:636

___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:707
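[All of these failures share one root cause: the suite expects a live SPARQL
endpoint, but pbuilder disables network access during the build, so every
request to 127.0.0.1:9 is refused. A hedged sketch of one common guard --
assuming a hypothetical module-level skip, not something this package's test
configuration is shown to define:

    import socket
    import pytest

    def _endpoint_up(host="127.0.0.1", port=9):
        # Probe once; skip network-bound tests when nothing is listening.
        try:
            socket.create_connection((host, port), timeout=1).close()
            return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not _endpoint_up(), reason="SPARQL endpoint unreachable (no network)"
    )
]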
[The urllib/socket traceback above repeats verbatim for each failure below;
only the test method, its location in the test file, and the request headers
differ. Each entry is reduced to the failing call, the request headers, and
the chained errors.]

_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:716:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:476:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:484:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:885:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
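The recurring SPARQLWrapper frames (Wrapper.py:960 query -> Wrapper.py:926
_query -> urlopener(request)) show what each test exercises. Reduced to plain
SPARQLWrapper 2.0.0 calls, the POST/ASK variants amount to roughly the sketch
below; the endpoint URL is a stand-in for the AllegroGraph service behind
mmisw.org that the offline chroot cannot reach:

    from SPARQLWrapper import SPARQLWrapper, POST, XML

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # hypothetical URL
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)          # the POST tests send a form-encoded body
    sparql.setReturnFormat(XML)     # drives the Accept header logged above
    result = sparql.query()         # -> urlopen -> URLError in this log

The GET/CONSTRUCT variants differ only in the method (GET) and the requested
return format (CSV, JSON, N3, ...), which is why their logged Accept headers
vary while the failure is identical.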
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:894: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected(self): > result = self.__generic(constructQuery, JSON, GET) test/test_allegrograph__v4_14_1__mmi.py:921: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:930: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) test/test_allegrograph__v4_14_1__mmi.py:823: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:830:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
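One layer up, urllib's do_open() wraps that OSError in a urllib.error.URLError, which is what the test suite actually sees escape from sparql.query(). The low-level ConnectionRefusedError stays reachable on the .reason attribute; a short sketch, using the same unreachable address as the log:

# Sketch: inspecting the URLError that these tests fail with.
import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=2)
except urllib.error.URLError as exc:
    # do_open() raises URLError(err), so the original OSError survives
    # as exc.reason (here a ConnectionRefusedError).
    print(type(exc.reason).__name__, exc.reason)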
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_allegrograph__v4_14_1__mmi.py:760:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:768:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
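On the application side, every traceback passes through the same three SPARQLWrapper frames (query -> _query -> urlopener). A hedged sketch of the call path the suite's __generic helper exercises, based on the public SPARQLWrapper 2.0.0 API; the endpoint URL and query below are placeholders, not the suite's actual fixtures:

# Sketch (assumed fixtures): the query path that fails in this build.
from SPARQLWrapper import SPARQLWrapper, N3, GET

sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(GET)
sparql.setReturnFormat(N3)   # drives the N3/Turtle Accept header seen above

result = sparql.query()      # Wrapper.py:960 -> _query() -> urlopen()
print(result.response.status)

With the build's network disabled, the urlopen() call inside _query() is what raises the URLError recorded in each failure.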
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:956:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:964:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:731:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
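All of these failures are environmental rather than bugs in the code under test: the suite requires a live SPARQL endpoint, and this build has none. A common guard (an assumption for illustration, not something this package's test suite does) is to probe the endpoint once and skip network tests when it is unreachable:

# Sketch: skip network-dependent tests instead of failing offline.
import socket
import pytest

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Best-effort TCP probe of the SPARQL endpoint."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

requires_network = pytest.mark.skipif(
    not endpoint_reachable("mmisw.org", 443),
    reason="SPARQL endpoint unreachable (network disabled in this build)",
)

@requires_network
def test_construct_by_get_in_xml():
    ...  # the real query would run here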
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:738:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}

address = ('127.0.0.1', 9), source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:903:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
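The *ByPOST* variants fail identically but transport the query as an application/x-www-form-urlencoded request body rather than in the URL, which is where the differing Content-Length values in the header dumps (662, 628, 665) come from. A sketch of the equivalent raw request; the endpoint URL is a placeholder:

# Sketch: the shape of the POST request behind the *ByPOST* tests.
import urllib.parse
import urllib.request

query = "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5"
body = urllib.parse.urlencode({"query": query}).encode("ascii")

req = urllib.request.Request(
    "https://example.org/sparql",          # hypothetical endpoint
    data=body,                             # supplying data= makes this a POST
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
# urlopen() fills in Content-Length from the body, matching the log above.
response = urllib.request.urlopen(req, timeout=10)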
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected>

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:939: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
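Note: the second E line in each failure shows urllib.request wrapping the low-level OSError in a URLError at request.py:1347, so callers such as SPARQLWrapper see a single exception type. A sketch of how that surfaces to a caller, using the same unreachable endpoint as the tests:

    import urllib.error
    import urllib.request

    try:
        # Same scheme/host/port the tests use; nothing listens there.
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=1.0)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError
        print(type(err.reason).__name__, "->", err.reason)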
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg>

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:948: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '659',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinN3>

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:837: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
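Note: the frames at Wrapper.py:960 and Wrapper.py:926 show the call path each test takes: SPARQLWrapper.query() builds the request and hands it to a urllib opener. A sketch of that path under the same conditions as the N3 test above; the endpoint URL and query text here are placeholders, not taken from the test suite:

    from SPARQLWrapper import N3, POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # unreachable on purpose
    sparql.setMethod(POST)
    sparql.setReturnFormat(N3)
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o }")
    # query() -> _query() -> urlopener(request), raising
    # urllib.error.URLError exactly as in the traceback above.
    result = sparql.query()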
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:844: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
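Note: a side detail visible in the do_open() listing above: when a request is tunnelled through a proxy, the Proxy-Authorization header is removed from the origin-server headers and passed to set_tunnel() instead, so the credential only reaches the proxy. A sketch of that split, with placeholder host names and a dummy credential:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128)
    # The credential goes only to the proxy's CONNECT request ...
    conn.set_tunnel("origin.example", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    # ... while headers later passed to conn.request() go only to the
    # origin server, mirroring the `del headers[proxy_auth_hdr]` above.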
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '768',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML>

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_allegrograph__v4_14_1__mmi.py:776: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
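Note: the captured headers dicts are what distinguish these otherwise identical failures; the Accept value follows the return format each test requested. A summary of the values visible in this run, read off the log itself rather than SPARQLWrapper's internal tables:

    # Accept headers observed above, keyed by the format the test requested.
    ACCEPT_SEEN = {
        "XML / RDFXML / unknown 'bar'": "application/rdf+xml",
        "N3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "CSV / JSON (unexpected for CONSTRUCT)": "*/*",
    }
    for fmt, accept in ACCEPT_SEEN.items():
        print(f"{fmt:40} Accept: {accept}")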
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML_Conneg>

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:784: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '662',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow>

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:972: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
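Note: the create_connection() docstring quoted in each traceback documents the timeout and source_address parameters; a short sketch of both, wrapped so a refused or unreachable target is reported rather than raised. Host and port here are placeholders:

    import socket

    try:
        sock = socket.create_connection(
            ("example.org", 80),            # (host, port) 2-tuple, as documented
            timeout=5.0,                    # set on the socket before connect()
            source_address=("0.0.0.0", 0),  # '' or port 0: let the OS choose
        )
        sock.close()
    except OSError as err:
        print("connect failed:", err)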
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x…>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() -> create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:980: 
[... sparql.query() -> urllib.request frames identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first traceback above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    [socket.create_connection() source identical to the first traceback above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:745:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request call chain identical to the first traceback above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first traceback above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
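The socket.create_connection() source repeated in these dumps also documents the `raise exceptions[0]` frame: every getaddrinfo() result is tried in turn, per-address errors are collected, and unless all_errors=True only one of them is re-raised. A short sketch of both behaviours against the same unreachable address (all_errors and ExceptionGroup require Python 3.11+):

import socket

addr = ("127.0.0.1", 9)  # the unreachable endpoint from the log

# Default: the collected per-address error is re-raised as-is.
try:
    socket.create_connection(addr, timeout=2, source_address=("", 0))
except ConnectionRefusedError as err:
    print("single error:", err)

# all_errors=True reports every attempt's failure as a group.
try:
    socket.create_connection(addr, timeout=2, all_errors=True)
except ExceptionGroup as group:
    for err in group.exceptions:
        print("grouped error:", err)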
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:752:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:1148:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
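One detail worth extracting from the do_open() source repeated above: it forces `Connection: close` (the addinfourl response object cannot handle a persistent connection) and then canonicalizes header names with str.title(). That merge-and-normalize step in isolation, using header values taken from the log:

# The header merge and normalization performed by do_open() above.
unredirected = {"Host": "mmisw.org"}
regular = {
    "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
    "accept": "*/*",
}

headers = dict(unredirected)
headers.update({k: v for k, v in regular.items() if k not in headers})
headers["Connection"] = "close"  # forced for the one-shot request
headers = {name.title(): val for name, val in headers.items()}
print(headers)
# {'Host': 'mmisw.org',
#  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
#  'Accept': '*/*', 'Connection': 'close'}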
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1158:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:1185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1194:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
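For orientation, the top frames of every traceback (sparql.query() -> Wrapper._query() -> urlopener(request)) correspond to ordinary SPARQLWrapper usage. A sketch of roughly what the failing DESCRIBE tests exercise; the endpoint URL and query are illustrative placeholders, not the suite's actual fixtures:

from SPARQLWrapper import GET, N3, SPARQLWrapper

# Placeholder endpoint standing in for the suite's unreachable
# 127.0.0.1:9 service; the DESCRIBE query is illustrative only.
sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")
sparql.setQuery("DESCRIBE <http://example.org/resource>")
sparql.setMethod(GET)
sparql.setReturnFormat(N3)  # drives the Accept header seen in the log

# query() builds a urllib request and opens it; with nothing
# listening, the URLError above surfaces here.
result = sparql.query()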
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:1086:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source, http.client call chain, and socket.create_connection()
    failure identical to the tracebacks above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1093:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request / do_open() frames identical to the tracebacks above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) test/test_allegrograph__v4_14_1__mmi.py:1023: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1031:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
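The create_connection() listing quoted in full above also explains the innermost frame: the function tries each getaddrinfo() result in turn, collects the per-address exceptions, and re-raises only the last one unless all_errors=True is requested. A small sketch of the difference, again assuming the same closed local port:

    import socket

    # Default behaviour: only the last per-address error propagates.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)                     # [Errno 111] Connection refused

    # all_errors=True (Python 3.11+) reports every failed attempt.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except ExceptionGroup as group:
        print(len(group.exceptions))   # one error per getaddrinfo() result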
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
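On the test side, every traceback funnels through the same three SPARQLWrapper frames (query() -> _query() -> urlopener(request)), so what each __generic() call exercises is essentially the library's public query API. A hedged equivalent of one GET case, with a placeholder endpoint and query (the suite's own fixtures are not visible in this log):

    from SPARQLWrapper import GET, RDFXML, SPARQLWrapper

    # Placeholder endpoint and resource; the real suite targets an
    # AllegroGraph instance behind mmisw.org.
    sparql = SPARQLWrapper("https://example.org/sparql")
    sparql.setQuery("DESCRIBE <http://example.org/resource>")
    sparql.setMethod(GET)             # the POST variants differ only here
    sparql.setReturnFormat(RDFXML)    # drives the Accept header shown above
    try:
        result = sparql.query()       # urlopen() happens inside _query()
    except Exception as err:
        print(err)                    # URLError in this sandboxed build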
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinXML>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:994:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1001:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
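A reading aid for the captured headers dicts: do_open() normalizes every header name with str.title() after merging the redirected and unredirected headers, which is why the log consistently shows 'User-Agent' and 'Content-Type' regardless of how the caller spelled them. For instance:

    # Header-name normalization, as performed in do_open() above.
    headers = {"user-agent": "sparqlwrapper 2.0.0", "ACCEPT": "application/rdf+xml"}
    print({name.title(): val for name, val in headers.items()})
    # {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': 'application/rdf+xml'}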
____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected>

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:1167:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
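The POST variants carry the query in a urlencoded body, hence the Content-Length and application/x-www-form-urlencoded headers captured above; the drop from 465/468 to 431 in the _Conneg variants plausibly reflects omitting an explicit output-format parameter in favour of the Accept header alone. A sketch of the encoding step, with a hypothetical parameter set (SPARQLWrapper's exact parameter names depend on the configured endpoint vendor):

    from urllib.parse import urlencode

    # Hypothetical request body; only the mechanism is illustrated.
    body = urlencode({"query": "DESCRIBE <http://example.org/resource>"}).encode()
    print(len(body))  # this byte count becomes the Content-Length header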
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected>

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:1203:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... SPARQLWrapper query() chain and repeated do_open() frame identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x...>, http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() frame, http.client call chain, and create_connection() listing identical to testDescribeByGETinRDFXML above ...]
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1212:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
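Note: as the raise URLError(err) in do_open() above shows, urlopen() surfaces
socket-level failures as urllib.error.URLError with the original OSError kept
in .reason. A minimal sketch, again assuming 127.0.0.1:9 refuses connections:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5.0)
    except urllib.error.URLError as err:
        # err.reason is the wrapped ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)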
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '462',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:1100:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
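Note: the "During handling of the above exception, another exception occurred"
banner is Python's implicit exception chaining: raising inside an except block
records the original error on __context__. A self-contained sketch:

    from urllib.error import URLError

    def connect_or_wrap():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            raise URLError(err)  # err becomes the new exception's __context__

    try:
        connect_or_wrap()
    except URLError as wrapped:
        print(repr(wrapped.__context__))  # the original ConnectionRefusedError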
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1107:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
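Note: the Accept headers above differ per test because SPARQLWrapper performs
content negotiation for the requested return format. The same effect with
plain urllib, against a hypothetical endpoint URL:

    import urllib.request

    req = urllib.request.Request(
        "https://example.org/sparql",  # hypothetical endpoint
        headers={"Accept": "application/rdf+xml"},  # ask for RDF/XML
    )
    print(req.get_header("Accept"))  # -> application/rdf+xml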
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '571',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1039:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
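Note: the POST variants send the query urlencoded in the request body, which
is where the Content-Type 'application/x-www-form-urlencoded' and the varying
Content-Length values above come from. A hypothetical equivalent:

    import urllib.parse
    import urllib.request

    body = urllib.parse.urlencode(
        {"query": "DESCRIBE <http://example.org/resource>"}  # hypothetical query
    ).encode("ascii")
    req = urllib.request.Request(
        "https://example.org/sparql",  # hypothetical endpoint
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    print(req.get_method(), len(body))  # POST, plus the eventual Content-Length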
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1047:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:1236:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
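Note: the proxy branch of do_open() above keeps Proxy-Authorization on the
CONNECT tunnel only, never forwarding it to the origin server. A sketch of the
underlying http.client call, with a hypothetical proxy address and credential:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=5.0)
    conn.set_tunnel(
        "mmisw.org", 443,
        headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"},  # CONNECT only
    )
    # conn.request("GET", "/")  # would open the tunnel, then speak TLS
    #                           # to mmisw.org through it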
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1244:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
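Note: the all_errors flag documented in the create_connection() docstring
(Python 3.11+) raises an ExceptionGroup collecting the failure from every
address getaddrinfo() returned, instead of just the last one. Sketch:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5.0, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)  # [Errno 111] Connection refused, once per address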
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() -> http.client -> socket.create_connection traceback identical to
 testDescribeByPOSTinJSON_Unexpected above]

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1008:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:1015: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_allegrograph__v4_14_1__mmi.py:1269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
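testKeepAlive is the one failure whose test body appears in full above; it drives SPARQLWrapper's public API directly instead of going through the suite's __generic helper. A standalone sketch of the same call pattern, assuming the SPARQLWrapper 2.0.0 API shown in this log; the endpoint URL is a placeholder, not the suite's real one:

    from urllib.error import URLError

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    # Placeholder; the suite points at an mmisw.org service that this
    # build environment remaps to the unreachable 127.0.0.1:9.
    endpoint = "https://example.org/sparql"

    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    # Installs a keep-alive handler when the optional 'keepalive'
    # package is present; otherwise it only warns, which is why the
    # test still reaches query() in this log.
    sparql.setUseKeepAlive()

    try:
        results = sparql.query().convert()
    except URLError as err:           # what the offline build runs into
        print("endpoint unreachable:", err.reason)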
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
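testQueryBadFormed expects QueryBadFormed, which (as I read SPARQLWrapper) is its mapping for an HTTP 400 Bad Request answer; with no server reachable, the URLError above surfaces before any HTTP status exists, so assertRaises sees the wrong exception type. A hedged sketch of the expectation, with a placeholder endpoint and a deliberately broken query:

    from urllib.error import URLError

    from SPARQLWrapper import SPARQLWrapper, XML
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder
    sparql.setQuery("SELECT * WHERE {?s ?p ?o ")          # missing brace
    sparql.setReturnFormat(XML)

    try:
        sparql.query()
    except QueryBadFormed as err:
        # Requires a live server answering HTTP 400.
        print("server rejected the query:", err)
    except URLError as err:
        # What actually happens in this offline build.
        print("never reached the server:", err.reason)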
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1260:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
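The "During handling of the above exception, another exception occurred:" banner in every block comes from do_open re-raising inside its except OSError handler, which links the two exceptions through Python's implicit chaining. A self-contained illustration:

    from urllib.error import URLError

    def do_request():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:     # same shape as do_open's handler
            raise URLError(err)    # __context__ keeps the original

    try:
        do_request()
    except URLError as err:
        print(err.reason)          # [Errno 111] Connection refused
        print(err.__context__)     # the original ConnectionRefusedError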
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
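The header dicts printed with each failure are the output of the normalisation do_open applies before anything touches the network: unredirected headers win, every name is Title-Cased, and Connection: close is forced so the response can be read to EOF. The same steps re-run outside urllib, with values copied from this log:

    unredirected_hdrs = {"Host": "mmisw.org"}
    request_headers = {
        "accept": "application/sparql-results+xml",
        "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
    }

    headers = dict(unredirected_hdrs)
    headers.update({k: v for k, v in request_headers.items()
                    if k not in headers})
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Host': 'mmisw.org', 'Accept': 'application/sparql-results+xml',
    #  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
    #  'Connection': 'close'}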
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1281:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
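create_connection collects one error per resolved address and, by default, re-raises a single representative error (the raise exceptions[0] in the frames above; trivially the only one for 127.0.0.1). Since Python 3.11 its documented all_errors=True flag surfaces all of them as an ExceptionGroup instead, which matters on dual-stack hosts where one name yields several addresses. A small sketch:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2,
                                 all_errors=True)
    except* ConnectionRefusedError as group:
        # With a single resolved address the group holds one error.
        for err in group.exceptions:
            print(err)             # [Errno 111] Connection refused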
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:245:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
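The Accept header is the only substantive difference between these otherwise identical failures: it tracks the return format each test requests. The mapping below is assembled purely from the header dumps in this log, not from SPARQLWrapper's source:

    # Accept headers observed in this run, per requested SELECT format.
    ACCEPT_BY_FORMAT = {
        "CSV": "text/csv",
        "XML": "application/sparql-results+xml",
        "JSON": ("application/sparql-results+json,application/json,"
                 "text/javascript,application/javascript"),
        "JSONLD (unexpected for SELECT)": "*/*",
    }
    for fmt, accept in ACCEPT_BY_FORMAT.items():
        print(f"{fmt:32s} -> Accept: {accept}")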
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:252:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
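do_open's tunnelling branch (visible in the full listing further up) moves Proxy-Authorization onto the CONNECT request and strips it from the headers bound for the origin server, so the credential never leaks past the proxy. The same handling sketched directly against http.client; the proxy host and credential are hypothetical, and nothing here opens a socket:

    import http.client

    headers = {"Accept": "text/csv",
               "Proxy-Authorization": "Basic Zm9vOmJhcg=="}  # hypothetical

    conn = http.client.HTTPSConnection("proxy.example", 3128)  # hypothetical
    tunnel_headers = {}
    if "Proxy-Authorization" in headers:
        # Credential rides on the CONNECT request only.
        tunnel_headers["Proxy-Authorization"] = headers.pop(
            "Proxy-Authorization")
    conn.set_tunnel("mmisw.org", 443, headers=tunnel_headers)
    # conn.request(...) would now CONNECT through the proxy first.
    print(headers)   # origin-bound headers no longer carry the credential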
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
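The frames between urlopen and do_open show urllib's dispatch: urlopen delegates to an OpenerDirector, _call_chain walks the installed handlers by protocol, and HTTPSHandler.https_open lands in do_open with http.client.HTTPSConnection. The default chain reproduces this run's error without SPARQLWrapper in the picture:

    import urllib.request
    from urllib.error import URLError

    opener = urllib.request.build_opener()   # default handler chain
    try:
        opener.open("https://127.0.0.1:9/", timeout=2)
    except URLError as err:
        print(err.reason)                    # [Errno 111] Connection refused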
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_allegrograph__v4_14_1__mmi.py:376: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
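As the do_open() frame above shows, urllib.request catches the low-level OSError and re-raises it as a URLError, which is what SPARQLWrapper ultimately propagates. A short sketch of that wrapping against the same unreachable address:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)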
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:387: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
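The create_connection() docstring quoted in these tracebacks mentions the all_errors flag (Python 3.11+): when set, every per-address connection error is collected into an ExceptionGroup rather than only the last one being raised. A sketch against the same discard port:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except ExceptionGroup as eg:
        # One entry per address that getaddrinfo() returned
        for exc in eg.exceptions:
            print(type(exc).__name__, exc)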
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:308: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
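Every failing test funnels through SPARQLWrapper.query() at Wrapper.py:960. The equivalent standalone call looks roughly like the sketch below; the endpoint URL is a placeholder standing in for the suite's mmisw.org target:

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # placeholder URL
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    # Raises urllib.error.URLError when the endpoint is unreachable,
    # exactly as in the failures above.
    results = sparql.query().convert()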
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
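The same docstring also covers source_address: a (host, port) pair bound before connecting, where '' or 0 delegates the choice to the OS. A sketch, which needs real network access (unlike this build) and uses example.org as a placeholder peer:

    import socket

    # ('', 0) lets the OS pick both the source interface and the port.
    sock = socket.create_connection(("example.org", 80), timeout=5,
                                    source_address=("", 0))
    print(sock.getsockname())  # the (host, port) the OS chose
    sock.close()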
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:343: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
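Before sending, do_open() normalizes the outgoing headers: it forces Connection: close (the addinfourl response object cannot manage a persistent connection, per the comment in the listing) and title-cases every header name. The same two steps in isolation:

    # Mirrors the normalization in do_open() above.
    headers = {"accept": "*/*", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': '*/*', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}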
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:273: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
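The Accept headers captured in these requests track the requested return format: formats a SELECT query cannot yield (JSONLD, N3) fall back to */*, and an unknown format such as "foo" falls back to the XML default. An illustrative summary of the values observed in this log, not SPARQLWrapper internals:

    ACCEPT_SEEN = {
        "JSON": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "TSV": "text/tab-separated-values",
        "XML and unknown ('foo')": "application/sparql-results+xml",
        "JSONLD/N3 (unexpected for SELECT)": "*/*",
    }
    for fmt, accept in ACCEPT_SEEN.items():
        print(f"{fmt}: {accept}")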
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:280: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
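Per the docstring above, create_connection() only calls settimeout() when an explicit timeout argument is passed; otherwise the global default returned by socket.getdefaulttimeout() governs the new socket. Inspecting and setting that default:

    import socket

    print(socket.getdefaulttimeout())  # None by default: blocking sockets
    socket.setdefaulttimeout(5.0)      # now used when no timeout is passed
    print(socket.getdefaulttimeout())  # 5.0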
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:418: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() / http.client / create_connection() frames identical to
testSelectByGETinJSON above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
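All of these tests require a live endpoint, so an offline build necessarily fails them; one conventional guard is a pytest skip marker driven by a reachability probe. A sketch only, with a hypothetical helper that is not part of this package:

    import socket

    import pytest

    def endpoint_reachable(host="mmisw.org", port=443, timeout=3):
        # Hypothetical probe: True only if a TCP connection succeeds.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not endpoint_reachable(), reason="SPARQL endpoint unreachable"
    )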
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow_Conneg(self): > result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:427: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:213: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
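
Note that every trace connects to host = '127.0.0.1:9' while the Host header says mmisw.org, and do_open() takes the req._tunnel_host / h.set_tunnel() branch. That combination is what an HTTPS request looks like when it is forced through a proxy at 127.0.0.1:9: urllib connects to the proxy address first and never reaches the origin. A hedged sketch of that mechanism (the proxy address is taken from the traces; how the build environment actually configures it is an assumption here):

    import urllib.error
    import urllib.request

    # Route HTTPS through a dead local proxy; urllib would CONNECT-tunnel
    # to mmisw.org via 127.0.0.1:9 but fails at the TCP connect to the proxy.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
    )
    try:
        opener.open("https://mmisw.org/", timeout=1)
    except urllib.error.URLError as err:
        print(err.reason)  # [Errno 111] Connection refused
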
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:221: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:259: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
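
For context, the call chain test/test_allegrograph__v4_14_1__mmi.py:189 -> sparql.query() -> Wrapper._query() -> urlopen in these traces corresponds to ordinary SPARQLWrapper usage. A sketch of roughly what the CSV/POST tests exercise, with a hypothetical endpoint URL (the real one lives in the test module):

    from SPARQLWrapper import SPARQLWrapper, CSV, POST

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # hypothetical endpoint path
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)        # body sent as application/x-www-form-urlencoded
    sparql.setReturnFormat(CSV)   # requests Accept: text/csv, as in the trace
    result = sparql.query()       # raises URLError when the endpoint is unreachable
    print(result.convert())
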
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:266: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:315: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_allegrograph__v4_14_1__mmi.py:398: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:409: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
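
The JSONLD and N3 "_Unexpected" traces send Accept: '*/*' instead of a concrete media type, because JSON-LD and N3 are not result formats for a SELECT query, so SPARQLWrapper falls back to a wildcard. A sketch illustrating the difference; _getAcceptHeader is a private helper inside SPARQLWrapper 2.0.0's Wrapper.py, so this is for illustration only, and the endpoint URL is hypothetical:

    from SPARQLWrapper import SPARQLWrapper, JSON, JSONLD

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # hypothetical endpoint path
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")

    sparql.setReturnFormat(JSON)
    print(sparql._getAcceptHeader())   # application/sparql-results+json,...

    sparql.setReturnFormat(JSONLD)     # not a valid SELECT result format
    print(sparql._getAcceptHeader())   # */*
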
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:322: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:354:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:365:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:287:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:294:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
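The two-level traceback in every failure comes from do_open() itself, quoted in full above: it catches the OSError raised during h.request() and re-raises it wrapped in URLError, so pytest reports both exceptions chained. A short sketch of that wrapping, again assuming the same unreachable local port:

    # Sketch: urllib wraps the socket-level OSError in urllib.error.URLError,
    # which is why each failure shows ConnectionRefusedError, then URLError.
    # Assumption: https://127.0.0.1:9/ is unreachable, as in the chroot.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError instance
        print(type(err.reason).__name__, err.reason)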
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:436:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:445:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:229:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[identical do_open/create_connection traceback as above, elided]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:237:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
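Each test enters urllib through the same SPARQLWrapper chain shown in the frames above (query() -> _query() -> urlopener(request)). A hypothetical reconstruction of that call pattern follows; the endpoint URL and query text are illustrative placeholders, not the suite's actual fixtures:

    # Hypothetical sketch of the call pattern behind these tests; the endpoint
    # and query are placeholders, not the suite's real fixtures.
    from SPARQLWrapper import JSON, POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # deliberately unreachable
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    result = sparql.query()  # raises urllib.error.URLError in the chroot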
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) test/test_blazegraph__wikidata.py:580: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected>

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_blazegraph__wikidata.py:655:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:666:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:587:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
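The "During handling of the above exception, another exception occurred" sections show how urllib surfaces the failure: do_open() catches the OSError from the connect and re-raises it as URLError at urllib/request.py:1347, keeping the original exception as the .reason attribute. A short sketch of the same wrapping, again assuming the unreachable address from the log:

    import urllib.error
    import urllib.request

    try:
        # Assumed-unreachable address, as in the failing tests above.
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        print(type(err.reason).__name__)  # ConnectionRefusedError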
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected>

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_blazegraph__wikidata.py:611:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
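A side detail from the repeated do_open() listing explains why the headers dicts above are always title-cased and always contain 'Connection': 'close': do_open() forces that header and normalizes every header name with str.title() before sending. The same normalization in isolation:

    # Header normalization as performed inside urllib's do_open().
    headers = {"accept": "*/*", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': '*/*', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}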
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinUnknow>

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_blazegraph__wikidata.py:697:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:706:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
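For reference, the application-level path is identical in every failure and is visible in the repeated frames: the test calls __generic(), which calls sparql.query() (SPARQLWrapper/Wrapper.py:960), which builds a urllib request and hands it to urlopen() (Wrapper.py:926). A hedged sketch of the equivalent direct usage; the endpoint URL is an assumption inferred from the 'Host: query.wikidata.org' header above, and the ASK query is a stand-in for the suite's askQuery:

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")  # assumed endpoint
    sparql.setQuery("ASK { ?s ?p ?o }")  # stand-in for the suite's askQuery
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    result = sparql.query()  # ends in urllib.request.urlopen(); raises URLError offline
    print(result.convert())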
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() and create_connection() frames identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinXML>

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_blazegraph__wikidata.py:484:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    [... sparql.query() -> urlopen() -> do_open() chain identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:492:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
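For reference, the call path these tests exercise can be approximated as below. This is a hedged sketch of SPARQLWrapper usage, not the suite's actual __generic helper (that lives in test/test_blazegraph__wikidata.py and is not reproduced in this log); setOnlyConneg corresponds to the *_Conneg variants:

from SPARQLWrapper import SPARQLWrapper, XML, GET

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setQuery("ASK { ?s ?p ?o }")
sparql.setMethod(GET)
sparql.setReturnFormat(XML)
sparql.setOnlyConneg(True)   # negotiate the format via Accept headers only
result = sparql.query()      # -> urlopener(request) in Wrapper.py, i.e. urllib
print(result.convert().toxml())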
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_blazegraph__wikidata.py:594:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
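One way such tests can pass in a no-network chroot is to stub the opener. The sketch below patches the urlopener name that Wrapper.py:926 calls; the fake payload, the FakeResponse class, and the content-type handling are assumptions and may need adjusting across SPARQLWrapper versions:

import io
import json
from unittest import mock

from SPARQLWrapper import SPARQLWrapper, JSON, POST

payload = json.dumps({"head": {}, "boolean": True}).encode("utf-8")

class FakeResponse(io.BytesIO):
    # QueryResult reads the Content-Type from response.info()
    def info(self):
        return {"Content-Type": "application/sparql-results+json"}

with mock.patch("SPARQLWrapper.Wrapper.urlopener",
                return_value=FakeResponse(payload)):
    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    print(sparql.query().convert())   # {'head': {}, 'boolean': True}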
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected>

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_blazegraph__wikidata.py:677:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
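The create_connection docstring quoted above describes the all_errors contract (Python 3.11+): every resolved address is tried, and with all_errors=True the failures arrive as one ExceptionGroup instead of only the last error. A standalone illustration against the same dead port:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
except* ConnectionRefusedError as eg:
    # eg is an ExceptionGroup; one entry per address that was tried
    for err in eg.exceptions:
        print("refused:", err)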
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:688:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:601:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
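do_open, quoted in full in the first traceback, forces 'Connection: close' (the addinfourl wrapper cannot manage persistent connections) and then Title-Cases the header names, which also deduplicates case variants. The normalization step in isolation:

headers = {"accept": "application/sparql-results+xml",
           "connection": "keep-alive"}
headers["Connection"] = "close"
# Title-casing merges 'connection' and 'Connection'; the later value wins.
headers = {name.title(): val for name, val in headers.items()}
print(headers)
# {'Accept': 'application/sparql-results+xml', 'Connection': 'close'}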
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected>

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_blazegraph__wikidata.py:633:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
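A common guard for endpoint-dependent suites (an assumption about possible practice, not something this package does) is to probe the endpoint once and skip rather than fail when it is unreachable:

import socket

import pytest

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    # A bare TCP probe; refused or timed-out connects both return False.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

pytestmark = pytest.mark.skipif(
    not endpoint_reachable("query.wikidata.org", 443),
    reason="SPARQL endpoint unreachable (network disabled during build)",
)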
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg>

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:644:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow>

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_blazegraph__wikidata.py:715:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
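The testAskByPOSTinUnknow variants pass the unsupported format "bar"; the request locals above show the Accept header falling back to SPARQL XML. A sketch of that behaviour (SPARQLWrapper is expected to warn and keep its default; the exact warning text varies by version):

from SPARQLWrapper import SPARQLWrapper, POST

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setQuery("ASK { ?s ?p ?o }")
sparql.setMethod(POST)
# Not an allowed format: a RuntimeWarning is emitted and the default
# return format is kept, so the request still advertises
# 'application/sparql-results+xml' in Accept.
sparql.setReturnFormat("bar")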
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:724:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
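The create_connection docstring also distinguishes a per-call timeout from the process-wide default returned by socket.getdefaulttimeout(). SPARQLWrapper exposes the per-query knob as setTimeout(); a hedged sketch of the two layers:

import socket

from SPARQLWrapper import SPARQLWrapper, XML

socket.setdefaulttimeout(30)   # global fallback for sockets created without a timeout
sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setQuery("ASK { ?s ?p ?o }")
sparql.setReturnFormat(XML)
sparql.setTimeout(5)           # per-query timeout passed down to urlopen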
_____________________ SPARQLWrapperTests.testAskByPOSTinXML ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_blazegraph__wikidata.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
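At the bottom of each of these tracebacks sits socket.create_connection(),
which loops over getaddrinfo() results and re-raises the last per-address
error when nothing connects. The refusal can be reproduced without urllib
at all; a sketch, assuming port 9 (discard) is closed locally as it is in
this build environment:

    import socket

    try:
        # Nothing listens on the discard port here, hence ECONNREFUSED.
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)   # [Errno 111] Connection refused
    else:
        sock.close()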
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:508:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
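The two-part tracebacks ("During handling of the above exception, another
exception occurred:") come from do_open() catching the OSError and raising
URLError(err) inside the except block, which chains the original exception
implicitly. A sketch of that pattern, with a hand-built ConnectionRefusedError
standing in for the real socket failure:

    from urllib.error import URLError

    def fetch():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            # Raising inside an except block sets __context__ on the new
            # exception; pytest renders that as the two-part traceback.
            raise URLError(err)

    try:
        fetch()
    except URLError as err:
        print(err.reason)        # the original ConnectionRefusedError
        print(err.__context__)   # same object, via implicit chaining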
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_blazegraph__wikidata.py:915:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
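The header dictionaries printed with each failure are already normalized:
do_open() forces "Connection: close" (the response object cannot handle a
persistent connection) and Title-Cases every header name before sending.
Those two lines from the listing above, runnable on their own with sample
headers taken from this log:

    headers = {
        "accept": "*/*",
        "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
    }
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': '*/*',
    #  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
    #  'Connection': 'close'}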
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:924:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
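The Accept headers vary with the requested return format: ASK queries send
'application/sparql-results+xml', CONSTRUCT in RDF/XML sends
'application/rdf+xml', N3 sends a turtle/N3 list, and the *_Unexpected
variants fall back to '*/*'. The *_Conneg tests exercise the mode where the
format is negotiated through the Accept header alone rather than an extra
URL parameter; with SPARQLWrapper that is toggled via setOnlyConneg(),
assuming the installed 2.0.0 API. A sketch (query text is illustrative):

    from SPARQLWrapper import SPARQLWrapper, GET, RDFXML

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(GET)
    sparql.setReturnFormat(RDFXML)   # -> Accept: application/rdf+xml
    sparql.setOnlyConneg(True)       # negotiate via Accept header only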
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:887:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
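The create_connection() docstring shown with the first failure mentions the
all_errors flag: by default only the last per-address error is raised (as
seen in these tracebacks), but all_errors=True collects every failure into
an ExceptionGroup. A sketch, assuming Python 3.11+ for the flag and the
except* syntax:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # One entry per address tried by getaddrinfo(); just 127.0.0.1 here.
        for err in group.exceptions:
            print(err)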
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:962:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:849:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:768:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:776:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[remaining frames and do_open/create_connection listings identical to
 testAskByPOSTinUnknow_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:776:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:811:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
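The socket.create_connection() docstring quoted in these tracebacks describes the *all_errors* knob added in Python 3.11: by default only the last per-address error is re-raised, while all_errors=True raises an ExceptionGroup of every attempt. A quick sketch of the difference; 127.0.0.1:9 is used only because it is the address these tests hit:

    # Sketch of socket.create_connection() error reporting (Python 3.11+).
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as exc:
        print("last error only:", exc)        # default behaviour, as in this log

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as group:           # every per-address failure, grouped
        print("all errors:", group.exceptions)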
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_blazegraph__wikidata.py:990:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:998:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
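As the do_open() source repeated in these frames shows, urllib catches the low-level OSError and re-raises it as URLError(err), so the original ConnectionRefusedError survives as the .reason attribute. A caller that wants to tell "endpoint down" apart from other failures has to unwrap it; a minimal sketch, with the URL a stand-in for the unreachable endpoint in this log:

    # Unwrapping the URLError raised in these tracebacks.
    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=2)
    except URLError as exc:
        if isinstance(exc.reason, ConnectionRefusedError):
            print("endpoint unreachable:", exc.reason)  # [Errno 111] Connection refused
        else:
            raise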
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_blazegraph__wikidata.py:739:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:746:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
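From here the run moves to the POST variants: SPARQLWrapper sends the query as an application/x-www-form-urlencoded body (hence the Content-Length and Content-Type headers in the frames below), and the *_Unexpected tests request formats a CONSTRUCT query is not expected to return, which appears to be why their Accept header falls back to */*. A rough stdlib sketch of the request being built; the endpoint and query text are stand-ins, not the suite's actual code:

    # Rough stdlib equivalent of the POST requests visible below.
    import urllib.parse
    import urllib.request

    query = "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5"       # placeholder query
    body = urllib.parse.urlencode({"query": query}).encode("ascii")

    req = urllib.request.Request(
        "https://query.wikidata.org/sparql",
        data=body,                                        # POST body -> Content-Length
        headers={
            "Accept": "*/*",                              # fallback for unexpected formats
            "User-Agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
        },
    )
    # urllib adds Content-Type: application/x-www-form-urlencoded automatically
    # for urlencoded data; opening this request inside the build chroot would
    # raise the same URLError seen throughout this log.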
____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected>

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)

test/test_blazegraph__wikidata.py:933:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> http.client -> socket.create_connection() frames identical to
     testConstructByGETinRDFXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:942:
    [SPARQLWrapper/Wrapper.py -> urllib.request call chain identical to
     testConstructByGETinRDFXML above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
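These failures (and the dozens like them in this log) are the expected outcome of pbuilder's no-network policy, with requests redirected to 127.0.0.1:9, rather than bugs in SPARQLWrapper itself. A common mitigation, sketched here with a hypothetical marker that does not exist in this package, is to probe the live endpoint once and skip network-bound tests when it is unreachable:

    # Hypothetical guard for live-endpoint tests; nothing like this exists in
    # the package under build. It is one way to keep such tests from failing
    # in an offline build environment like this chroot.
    import socket
    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Best-effort TCP probe of the SPARQL endpoint."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not endpoint_reachable("query.wikidata.org", 443),
        reason="SPARQL endpoint unreachable (offline build?)",
    )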
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:906: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:982:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '615',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:868:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_blazegraph__wikidata.py:784:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
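For reference, every failing test reduces to the same SPARQLWrapper call pattern visible in the frames above (test/test_blazegraph__wikidata.py:201 -> SPARQLWrapper/Wrapper.py:960/926). A hedged sketch of that pattern follows; the body of the __generic helper is not in this log, and the query text is a placeholder:

# Hedged sketch of the call pattern these tests exercise; __generic's actual
# body is not shown in this log, and the query here is a placeholder.
from SPARQLWrapper import SPARQLWrapper, POST, RDFXML

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")  # Host seen in the GET test below
sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 3")
sparql.setMethod(POST)          # the suite varies GET/POST
sparql.setReturnFormat(RDFXML)  # and XML/RDFXML/N3/TURTLE/JSONLD/JSON/CSV
result = sparql.query()         # Wrapper.py:960 -> _query -> urllib.request.urlopen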
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:792:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:830:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
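The "During handling of the above exception, another exception occurred:" banner that splits each failure into two tracebacks is Python's implicit exception chaining: do_open raises URLError inside its except OSError handler, so the original ConnectionRefusedError is retained on the new exception's __context__. A small standalone sketch of the same mechanism:

# Sketch: raising inside an except block chains the exceptions implicitly,
# which is why each failure above prints two tracebacks.
from urllib.error import URLError

try:
    raise ConnectionRefusedError(111, "Connection refused")
except OSError as err:
    raise URLError(err)  # __context__ still points at the ConnectionRefusedError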
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_blazegraph__wikidata.py:1006:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1014:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
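These tests need a live SPARQL endpoint, which a pbuilder build can never provide. One conventional way to let such a suite degrade gracefully is sketched below as an assumption, not as anything this package actually does (the helper name is hypothetical):

# Hypothetical guard, not taken from the package: skip endpoint tests when
# the network is unreachable, as it is in this build environment.
import socket

import pytest

def _endpoint_reachable(host="query.wikidata.org", port=443, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, DNS failures
        return False

pytestmark = pytest.mark.skipif(
    not _endpoint_reachable(), reason="SPARQL endpoint not reachable"
)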
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_blazegraph__wikidata.py:753:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)

test/test_blazegraph__wikidata.py:1200:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
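As the do_open() listing above shows, urllib wraps the low-level OSError in urllib.error.URLError, so a caller sees the socket error on the exception's .reason attribute. A stdlib-only sketch of that contract:

    from urllib.error import URLError
    from urllib.request import urlopen

    try:
        urlopen("http://127.0.0.1:9/", timeout=5)
    except URLError as exc:
        # exc.reason carries the original OSError, e.g.
        # ConnectionRefusedError(111, 'Connection refused')
        print(type(exc.reason).__name__, exc.reason)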
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1210:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1174:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
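The *_Conneg variants differ from their plain counterparts only in the Accept header SPARQLWrapper sends (compare 'application/ld+json,application/x-json+ld' here with 'application/rdf+xml' earlier). Expressed with plain urllib, using a hypothetical request:

    from urllib.request import Request

    # Hypothetical DESCRIBE request; only the Accept header matters for
    # content negotiation.
    req = Request(
        "https://query.wikidata.org/sparql?query=DESCRIBE%20wd:Q42",
        headers={"Accept": "application/ld+json,application/x-json+ld"},
    )
    print(req.get_header("Accept"))  # the only thing the _Conneg tests vary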
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1248:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1138:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
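The create_connection() source quoted in these tracebacks tries every getaddrinfo() result and, by default, re-raises only the last error; its all_errors flag (Python 3.11+) surfaces every attempt as an ExceptionGroup instead. A sketch, assuming a Python 3.11+ interpreter and, again, nothing listening on port 9:

    import socket

    try:
        socket.create_connection(("localhost", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # One ConnectionRefusedError per address tried (e.g. 127.0.0.1 and
        # ::1 when "localhost" resolves to both families).
        for exc in group.exceptions:
            print(exc)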
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:1057:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1065:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
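Note how do_open() forces 'Connection: close' and title-cases every header name before sending; that is why the header dumps in this log always show the canonical spellings. The two relevant lines, run in isolation:

    headers = {"connection": "keep-alive", "content-length": "649"}
    headers["Connection"] = "close"                    # forced, as in do_open()
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # {'Connection': 'close', 'Content-Length': '649'}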
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1100:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
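All of these errors are environmental rather than regressions in sparql-wrapper-python: the suite needs a live SPARQL endpoint and the build environment provides none. A hedged sketch of one common guard (not something this package currently does) that would turn the block above into skips under pytest:

    import socket

    import pytest

    def _endpoint_reachable(host="query.wikidata.org", port=443, timeout=3):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # As a module-level mark, this skips the whole file when offline
    # instead of producing the wall of URLError failures seen here.
    pytestmark = pytest.mark.skipif(
        not _endpoint_reachable(), reason="SPARQL endpoint unreachable"
    )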
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) test/test_blazegraph__wikidata.py:1276: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) test/test_blazegraph__wikidata.py:1284: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
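The Accept headers in the locals above track the requested return format: 'application/turtle,text/turtle' for TURTLE, and 'application/rdf+xml' both for XML and for the unknown format "foo" (which falls back to RDF/XML, as the testDescribeByGETinUnknow locals show). The *_Conneg variants pass onlyConneg=True so that only this header advertises the format. A sketch of the corresponding client-side setup, assuming SPARQLWrapper's public setOnlyConneg API; endpoint and query are illustrative:

from SPARQLWrapper import SPARQLWrapper, TURTLE

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")  # illustrative endpoint
sparql.setQuery("DESCRIBE <http://www.wikidata.org/entity/Q42>")  # illustrative query
sparql.setReturnFormat(TURTLE)  # yields Accept: application/turtle,text/turtle
sparql.setOnlyConneg(True)      # negotiate via the Accept header alone, as the *_Conneg tests do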
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) test/test_blazegraph__wikidata.py:1028: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) test/test_blazegraph__wikidata.py:1035: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
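As the do_open() listing repeated above shows, urllib catches the socket-level OSError and re-raises it as URLError, which is why each test ultimately reports urllib.error.URLError even though the primary failure is ConnectionRefusedError. The original exception remains reachable; a small sketch with an illustrative URL:

import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as err:
    # do_open() wraps the OSError, so the underlying
    # ConnectionRefusedError survives as err.reason.
    print(type(err.reason).__name__, err.reason)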
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, POST) test/test_blazegraph__wikidata.py:1219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:1228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
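The SPARQLWrapper frames repeated in each traceback (Wrapper.py:960 query -> Wrapper.py:926 _query -> urllib.request.urlopen) are the library's ordinary request path; the POST tests in this stretch differ from the GET ones only in carrying the query in an application/x-www-form-urlencoded body, as the Content-Type in the locals shows. A sketch of the equivalent direct usage; the endpoint is assumed from the Host: header above, and the query is illustrative since the suite's describeQuery is not shown in this log:

from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")  # endpoint assumed from the Host: header
sparql.setQuery("DESCRIBE <http://www.wikidata.org/entity/Q42>")  # illustrative query
sparql.setMethod(POST)          # sends the query as a form-encoded body
sparql.setReturnFormat(JSONLD)
result = sparql.query()         # raises urllib.error.URLError when the endpoint is unreachable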
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:1191: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:1268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg>

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1157:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML>

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_blazegraph__wikidata.py:1073:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
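The create_connection() docstring quoted in these frames also documents the *all_errors* flag (available since Python 3.11): with all_errors=True the per-address failures are collected into an ExceptionGroup instead of only the last one being re-raised. A small sketch of that behaviour against the same unreachable address:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as eg:
        # one ConnectionRefusedError per address returned by getaddrinfo()
        for exc in eg.exceptions:
            print(type(exc).__name__, exc)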
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML_Conneg>

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1081:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinTURTLE_Conneg>

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1119:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow>

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_blazegraph__wikidata.py:1292:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
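do_open(), shown in full in the first traceback above, normalizes the outgoing headers before sending: it forces Connection: close (because addinfourl cannot manage a persistent connection) and title-cases every header name. The same dict pipeline in isolation, with illustrative values:

    # header normalization as performed inside urllib's do_open()
    headers = {"accept": "application/rdf+xml",
               "content-type": "application/x-www-form-urlencoded"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'application/rdf+xml',
    #  'Content-Type': 'application/x-www-form-urlencoded',
    #  'Connection': 'close'}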
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow_Conneg>

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1300:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML>

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_blazegraph__wikidata.py:1042:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML_Conneg>

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1049:
[... SPARQLWrapper/Wrapper.py and urllib call chain identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source, http.client/socket call chain and create_connection() source identical to the first failure above ...]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_blazegraph__wikidata.py:1328:
[... SPARQLWrapper/Wrapper.py and urllib call chain as in the failures above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
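testKeepAlive is the one failure whose test body survives in the log, and it shows the client-side pattern all of these tests drive. A sketch of the same calls with the error handling an offline environment needs (the endpoint URL is illustrative; the API calls are the ones visible in the test source above):

    import urllib.error
    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")  # illustrative
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # uses the keepalive handler when that package is present

    try:
        results = sparql.query().convert()
    except urllib.error.URLError as err:
        # with no network, as in this build, the query fails here
        print("endpoint unreachable:", err.reason)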
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[do_open()/create_connection() traceback identical to testKeepAlive above:
 ConnectionRefusedError: [Errno 111] Connection refused connecting to '127.0.0.1:9']

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_blazegraph__wikidata.py:1310: via test/test_blazegraph__wikidata.py:201
(__generic), SPARQLWrapper/Wrapper.py:960 (query), SPARQLWrapper/Wrapper.py:926 (_query)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
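[socket.create_connection() is the layer where all of these failures
originate. A minimal sketch of the failing call, assuming nothing listens on
the discard port 9; per the docstring quoted above, passing all_errors=True
would raise an ExceptionGroup of every attempt instead of just the last error:]

    import socket

    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
        sock.close()
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused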
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

Accept: application/sparql-results+xml; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_blazegraph__wikidata.py:1313: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

Accept: application/sparql-results+xml; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_blazegraph__wikidata.py:1332: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

Accept: application/sparql-results+xml; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_blazegraph__wikidata.py:1341: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
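[The OSError never surfaces directly: do_open() re-raises it as
urllib.error.URLError, which is the error each test reports. A small sketch of
that wrapping, pointed at the same unreachable port:]

    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except URLError as err:
        # err.reason carries the original ConnectionRefusedError (errno 111)
        print(type(err.reason).__name__, err.reason)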
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

Accept: text/csv; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:267: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

Accept: application/sparql-results+json,application/json,text/javascript,application/javascript;
host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_blazegraph__wikidata.py:325: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

Accept: */*; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_blazegraph__wikidata.py:400: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

Accept: */*; host = '127.0.0.1:9'
[traceback identical to testKeepAlive above]

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:411: via :201 (__generic),
SPARQLWrapper/Wrapper.py:960/926
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_blazegraph__wikidata.py:356: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:367: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:301: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_blazegraph__wikidata.py:442: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
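
The socket.create_connection() listing quoted in each traceback documents the *all_errors* flag (Python 3.11+): by default only one error is re-raised (the chain above shows raise exceptions[0], and only a single address is tried here), while all_errors=True collects every failed attempt into an ExceptionGroup. A quick sketch of the difference, again assuming nothing listens on 127.0.0.1:9:

import socket

try:
    # all_errors=True keeps every per-address failure instead of one of them.
    socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
except* ConnectionRefusedError as group:
    # 127.0.0.1 resolves to a single address, so the group holds one error.
    print(f"{len(group.exceptions)} connection attempt(s) failed")
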

______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:451: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_blazegraph__wikidata.py:225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:233: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.
        If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:284: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
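
All of these errors come from the test suite unconditionally contacting a remote SPARQL endpoint, which cannot work in an offline build. Purely as an illustration (this is not what sparql-wrapper-python's suite does), a network-dependent unittest can guard itself with a reachability probe and skip instead of erroring; the endpoint constant and helper below are hypothetical:

import socket
import unittest

ENDPOINT = ("query.wikidata.org", 443)  # hypothetical guard target

def endpoint_reachable(addr, timeout=2):
    # Probe once; any OSError (refused, unreachable, DNS failure) means skip.
    try:
        socket.create_connection(addr, timeout=timeout).close()
        return True
    except OSError:
        return False

@unittest.skipUnless(endpoint_reachable(ENDPOINT), "SPARQL endpoint unreachable")
class NetworkDependentTests(unittest.TestCase):
    def test_placeholder(self):
        self.assertTrue(True)
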
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept 'application/sparql-results+json,application/json,text/javascript,application/javascript', Content-Length '442']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinJSON>

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_blazegraph__wikidata.py:339:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
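As the chained traceback shows, urllib never lets the raw OSError escape:
do_open() catches it and re-raises it as urllib.error.URLError, keeping the
original exception on .reason. A sketch of what a caller such as
SPARQLWrapper's _query() observes (the URL is a placeholder matching the
host and port in the log):

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason is the underlying ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)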
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept '*/*', Content-Length '476']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected>

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_blazegraph__wikidata.py:422:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
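The test-side frames are always the same: __generic() builds a SPARQLWrapper
instance and calls query(), which hands the request to urllib (Wrapper.py:926).
A rough reconstruction of that call path, assuming a generic endpoint URL and
query (the real ones live in the test module and are not shown in the log):

    from SPARQLWrapper import SPARQLWrapper, JSON, POST

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")  # placeholder query
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    result = sparql.query()  # raises urllib.error.URLError, as in every failure here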
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept '*/*', Content-Length '405']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:433:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
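The *_Conneg variants differ from the plain ones only in how the result format
is requested: pure content negotiation via the Accept header rather than a
format parameter in the POST body, which presumably explains their shorter
Content-Length values. SPARQLWrapper exposes this switch as setOnlyConneg();
a sketch under that assumption, with the same placeholder endpoint and query:

    from SPARQLWrapper import SPARQLWrapper, CSV, POST

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")  # placeholder query
    sparql.setMethod(POST)
    sparql.setReturnFormat(CSV)
    sparql.setOnlyConneg(True)  # negotiate via 'Accept: text/csv' only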
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept 'application/sparql-results+json,application/json,text/javascript,application/javascript', Content-Length '405']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinJSON_Conneg>

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:346:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
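The create_connection() docstring quoted in each traceback also mentions the
all_errors flag: by default a single representative error is re-raised (the
"raise exceptions[0]" frame at socket.py:865), while all_errors=True collects
every per-address failure into an ExceptionGroup (Python 3.11+). A sketch of
the latter, against the same closed port:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print(exc)  # one entry per attempted address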
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept '*/*', Content-Length '436']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinN3_Unexpected>

    def testSelectByPOSTinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, POST)

test/test_blazegraph__wikidata.py:378:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept '*/*', Content-Length '405']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinN3_Unexpected_Conneg>

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:389:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
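One detail of the do_open() source repeated above is worth isolating: when the
request goes through a proxy (req._tunnel_host is set), Proxy-Authorization is
moved onto the CONNECT tunnel and deleted from the origin-bound headers. A
sketch of the underlying http.client API, with made-up proxy and origin names:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
    # Credentials go to the proxy on CONNECT, never to origin.example:
    conn.set_tunnel("origin.example", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    # No network I/O happens until conn.request(...) is called.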
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept 'text/tab-separated-values', Content-Length '406']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinTSV_Conneg>

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:318:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
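These failures are expected in a build environment without network access: any
test that insists on reaching a live SPARQL endpoint can only fail. A common
pattern for keeping such suites green in offline chroots (a sketch; not what
this package's suite actually does) is to skip on URLError instead of failing:

    import unittest
    import urllib.error
    import urllib.request

    class EndpointTests(unittest.TestCase):
        def test_select_by_post(self):
            try:
                urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
            except urllib.error.URLError:
                self.skipTest("endpoint unreachable; network disabled in build")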
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

[traceback identical to testSelectByPOSTinCSV_Conneg above; only the request
headers differ: Accept 'application/sparql-results+xml', Content-Length '439']

self = <... SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow>

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_blazegraph__wikidata.py:460:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
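A last small point from the do_open() source: header names are normalized to
Title-Case before sending, which is why the captured header dicts all show
'Accept', 'Connection', 'Content-Length' and so on, regardless of how the
caller spelled them. The normalization is the one-line dict comprehension
quoted in the frame; in isolation:

    headers = {"accept": "text/csv", "content-length": "406"}
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # {'Accept': 'text/csv', 'Content-Length': '406'}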
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... same do_open() source as in the previous failure, failing at ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.

    [... rest of the docstring and body as quoted above, failing at ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
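These endpoint tests fail identically whenever the sandbox has no outbound network. One way such a suite could degrade gracefully is to probe once and skip instead of erroring; the names below (ENDPOINT, the probe helper, the marker) are illustrative and not taken from the package:

    import socket

    import pytest

    ENDPOINT = ("query.wikidata.org", 443)  # illustrative probe target

    def network_available(host, port, timeout=3.0):
        """True if a TCP connection to (host, port) can be opened."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not network_available(*ENDPOINT),
        reason="no network access in the build environment",
    )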
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_blazegraph__wikidata.py:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
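The do_open() source quoted in these tracebacks forces a "Connection: close" header because urllib's response object cannot manage a persistent connection. The same thing done by hand with http.client (the host is illustrative, and this of course needs working network):

    import http.client

    conn = http.client.HTTPSConnection("example.org", timeout=10)
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    body = resp.read()   # the server closes after this one response
    conn.close()
    print(resp.status, len(body))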
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperCLIParser_Test.testInvalidFormat _________________

self =

    def testInvalidFormat(self):
        with self.assertRaises(SystemExit) as cm:
            parse_args(["-Q", testquery, "-F", "jjssoonn"])
        self.assertEqual(cm.exception.code, 2)
>       self.assertEqual(
            sys.stderr.getvalue().split("\n")[1],
            "rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')",
        )
E       AssertionError: "rqw:[65 chars]from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)" != "rqw:[65 chars]from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rd[28 chars]ld')"
E       - rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)
E       + rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')
E       ? + + + + + + + + + + + + + + + + + +

test/test_cli.py:79: AssertionError
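Unlike the connection errors, testInvalidFormat fails on message wording: recent CPython releases dropped the quotes around the listed choices in argparse's "invalid choice" error, so an exact-string comparison breaks. A version-tolerant check could accept both spellings; this regex-based sketch is illustrative, not the package's code:

    import re

    observed = ("rqw: error: argument -F/--format: invalid choice: "
                "'jjssoonn' (choose from json, xml, turtle, n3, rdf, "
                "rdf+xml, csv, tsv, json-ld)")

    # '? makes the quote around each choice optional, matching both the
    # old and the new argparse output.
    pattern = re.compile(
        r"invalid choice: 'jjssoonn' \(choose from '?json'?, '?xml'?, "
        r"'?turtle'?, '?n3'?, '?rdf'?, '?rdf\+xml'?, '?csv'?, '?tsv'?, "
        r"'?json-ld'?\)"
    )
    assert pattern.search(observed)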
______________________ SPARQLWrapperCLI_Test.testQueryRDF ______________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryRDF(self):
>       main(["-Q", "DESCRIBE ", "-e", endpoint, "-F", "rdf"])

test/test_cli.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperCLI_Test.testQueryTo4store ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryTo4store(self):
>       main(["-e", "http://rdf.chise.org/sparql", "-Q", testquery])

test/test_cli.py:627:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
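The docstring's *all_errors* flag (Python 3.11+) changes what create_connection() raises on failure: instead of only the last per-address error, an ExceptionGroup carrying all of them. A quick sketch against the same unreachable address the tests hit:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except ExceptionGroup as group:
        for exc in group.exceptions:
            # Expect ConnectionRefusedError [Errno 111], as in this log.
            print(type(exc).__name__, exc)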
____________ SPARQLWrapperCLI_Test.testQueryToAgrovoc_AllegroGraph _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToAgrovoc_AllegroGraph(self):
>       main(["-e", "https://agrovoc.fao.org/sparql", "-Q", testquery])

test/test_cli.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
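Alternatively, this whole class of failure can be avoided offline by stubbing the opener that Wrapper.py reaches via urlopener(request). The patch target and canned payload below are illustrative, based only on the call chain visible in these tracebacks:

    import io
    import json
    import urllib.request
    from unittest import mock

    canned = {"head": {"vars": []}, "results": {"bindings": []}}

    def fake_urlopen(request, *args, **kwargs):
        # A file-like object is enough for callers that only .read().
        return io.BytesIO(json.dumps(canned).encode())

    with mock.patch("urllib.request.urlopen", fake_urlopen):
        resp = urllib.request.urlopen("https://query.wikidata.org/sparql")
        print(json.loads(resp.read()))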
________________ SPARQLWrapperCLI_Test.testQueryToAllegroGraph _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToAllegroGraph(self):
>       main(["-e", "https://mmisw.org/sparql", "-Q", testquery])

test/test_cli.py:378:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToBrazeGraph __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToBrazeGraph(self):
>       main(["-e", "https://query.wikidata.org/sparql", "-Q", testquery])

test/test_cli.py:546:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
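For reference, the failing CLI tests all funnel into the same library call seen at SPARQLWrapper/main.py:137. The equivalent direct usage, with the endpoint from testQueryToBrazeGraph (needs network access):

    from SPARQLWrapper import JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()   # query() then convert(), as in main()
    print(results)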
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_6 _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    [... same do_open() / http.client / socket.create_connection() chain as in
    the failures above, ending in ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToFuseki2V3_6(self):
>       main(["-e", "https://agrovoc.uniroma2.it/sparql/", "-Q", testquery])

test/test_cli.py:573:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urlopen()/open()/_open()/_call_chain() frames as above ...]
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
    [... locals and do_open() source as above, ending in ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_8 _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToFuseki2V3_8(self):
>       main(["-e", "http://zbw.eu/beta/sparql/stw/query", "-Q", testquery])

test/test_cli.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

______________ SPARQLWrapperCLI_Test.testQueryToGraphDBEnterprise ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToGraphDBEnterprise(self):
>       main(["-e", "http://factforge.net/repositories/ff-news", "-Q", testquery])

test/test_cli.py:405:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

__________________ SPARQLWrapperCLI_Test.testQueryToLovFuseki __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToLovFuseki(self):
>       main(["-e", "https://lov.linkeddata.es/dataset/lov/sparql/", "-Q", testquery])

test/test_cli.py:317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

____________________ SPARQLWrapperCLI_Test.testQueryToRDF4J ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToRDF4J(self):
>       main(
            [
                "-e",
                "http://vocabs.ands.org.au/repository/api/sparql/csiro_international-chronostratigraphic-chart_2018-revised-corrected",
                "-Q",
                testquery,
            ]
        )

test/test_cli.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

___________________ SPARQLWrapperCLI_Test.testQueryToStardog ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToStardog(self):
>       main(["-e", "https://lindas.admin.ch/query", "-Q", testquery, "-m", POST])

test/test_cli.py:432:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
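
The failures above and below differ only in which endpoint each test targets (Fuseki, GraphDB, RDF4J, Stardog, Virtuoso); all of them require outbound network access that this environment does not provide. One conventional way to keep such tests from failing in offline builders is an environment-driven skip marker. The sketch below is illustrative and not part of this package's test suite; the NO_NETWORK variable name is invented:

    import os

    import pytest

    # Hypothetical guard, assuming the builder exports NO_NETWORK=1 when
    # outbound connections are unavailable (the variable name is invented).
    requires_network = pytest.mark.skipif(
        os.environ.get("NO_NETWORK") == "1",
        reason="no outbound network in this environment",
    )

    @requires_network
    def test_query_live_endpoint():
        # main and testquery as defined in test/test_cli.py
        main(["-e", "http://dbpedia.org/sparql", "-Q", testquery])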

_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV7 __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToVirtuosoV7(self):
>       main(["-e", "http://dbpedia.org/sparql", "-Q", testquery])

test/test_cli.py:516:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToVirtuosoV8(self):
>       main(["-e", "http://dbpedia-live.openlinksw.com/sparql", "-Q", testquery])

test/test_cli.py:486:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError

_________________ SPARQLWrapperCLI_Test.testQueryWithEndpoint __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithEndpoint(self):
>       main(
            [
                "-Q",
                testquery,
                "-e",
                endpoint,
            ]
        )

test/test_cli.py:97:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperCLI_Test.testQueryWithFile ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithFile(self):
>       main(["-f", testfile, "-e", endpoint])

test/test_cli.py:135:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
(do_open() runs as quoted above; h.request() raises the OSError and
do_open() re-raises it wrapped:)
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
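The failing pattern above is reproducible with nothing but the standard library. A minimal sketch, assuming (as the reproducibility builder does) that nothing listens on 127.0.0.1:9, the discard port:

    import urllib.error
    import urllib.request

    # With no listener on port 9, the TCP handshake is refused and
    # urllib wraps the ConnectionRefusedError in a URLError, exactly
    # as in the chained traceback above.
    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as exc:
        print(type(exc.reason).__name__, exc.reason)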
__________________ SPARQLWrapperCLI_Test.testQueryWithFileCSV __________________

    def testQueryWithFileCSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "csv"])

test/test_cli.py:291:
(traceback identical to testQueryWithFile above; request Accept: text/csv,
Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
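Every SPARQLWrapperCLI_Test case drives the same entry point, SPARQLWrapper.main.main(), with an argv-style list, as the `>` lines show. A sketch of what the suite does, with hypothetical stand-ins for its testfile and endpoint fixtures:

    import tempfile
    from urllib.error import URLError

    from SPARQLWrapper.main import main  # the CLI entry point under test

    # Hypothetical stand-ins for the suite's `testfile` and `endpoint`.
    with tempfile.NamedTemporaryFile("w", suffix=".rq", delete=False) as f:
        f.write("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
        query_file = f.name

    try:
        # Equivalent to: main(["-f", testfile, "-e", endpoint, "-F", "csv"])
        main(["-f", query_file, "-e", "http://127.0.0.1:9/sparql", "-F", "csv"])
    except URLError as exc:
        print("endpoint unreachable:", exc.reason)  # what these tests hit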
__________________ SPARQLWrapperCLI_Test.testQueryWithFileN3 ___________________

    def testQueryWithFileN3(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "n3"])

test/test_cli.py:232:
(traceback identical to testQueryWithFile above; request Accept: */*,
Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
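A side note on the do_open() source quoted with the first failure: before sending, urllib forces Connection: close (the addinfourl response object cannot manage a persistent connection) and title-cases every header name, which is why the captured header dicts all show canonical names such as 'Accept' and 'User-Agent'. The same normalization in isolation:

    # The two normalization steps do_open() applies to request headers.
    headers = {"accept": "*/*", "user-agent": "sparqlwrapper 2.0.0"}

    headers["Connection"] = "close"  # one request per connection
    headers = {name.title(): val for name, val in headers.items()}

    print(headers)
    # {'Accept': '*/*', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}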
________________ SPARQLWrapperCLI_Test.testQueryWithFileRDFXML _________________

    def testQueryWithFileRDFXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "rdf+xml"])

test/test_cli.py:270:
(traceback identical to testQueryWithFile above; request Accept: */*,
Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
__________________ SPARQLWrapperCLI_Test.testQueryWithFileTSV __________________

    def testQueryWithFileTSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "tsv"])

test/test_cli.py:304:
(traceback identical to testQueryWithFile above; request Accept:
text/tab-separated-values, Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
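Across these CLI failures the requests differ only in the Accept header SPARQLWrapper derives from the -F flag (the JSON variant is also what testQueryWithFile got with no -F at all). The mapping as observed in the captured requests in this log, reconstructed here for reference rather than taken from the library's source:

    # Accept headers captured above, keyed by the CLI's -F argument.
    ACCEPT_BY_FORMAT = {
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",  # also the default
        "xml": "application/sparql-results+xml",
        "csv": "text/csv",
        "tsv": "text/tab-separated-values",
        "n3": "*/*",        # RDF serializations fall back to */*
        "turtle": "*/*",
        "rdf+xml": "*/*",
    }

    for fmt, accept in sorted(ACCEPT_BY_FORMAT.items()):
        print(f"-F {fmt:<8} -> Accept: {accept}")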
________________ SPARQLWrapperCLI_Test.testQueryWithFileTurtle _________________

    def testQueryWithFileTurtle(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle"])

test/test_cli.py:188:
(traceback identical to testQueryWithFile above; request Accept: */*,
Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________

    def testQueryWithFileTurtleQuiet(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle", "-q"])

test/test_cli.py:205:
(traceback identical to testQueryWithFile above; request Accept: */*,
Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
__________________ SPARQLWrapperCLI_Test.testQueryWithFileXML __________________

    def testQueryWithFileXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "xml"])

test/test_cli.py:167:
(traceback identical to testQueryWithFile above; request Accept:
application/sparql-results+xml, Host: ja.dbpedia.org)
E   ConnectionRefusedError: [Errno 111] Connection refused (socket.py:850)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (urllib/request.py:1347)
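Every SPARQLWrapperCLI_Test failure above is the same refused connection to 127.0.0.1:9, a consequence of pbuilder disabling network access, not a defect in the package. One conventional way a suite can degrade gracefully in a network-less chroot, sketched here only as an illustration (whether upstream or debian/rules should adopt anything like it is a separate question):

    import socket

    import pytest

    def _network_available(host: str, port: int, timeout: float = 1.0) -> bool:
        """Best-effort probe; returns False inside a chroot with networking off."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not _network_available("ja.dbpedia.org", 443),
        reason="no network access during build",
    )

    @requires_network
    def test_query_with_file():
        ...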
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

(do_open()/create_connection() traceback as above, this time through the
HTTPS handler, which adds one connect frame:)
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
(do_open() re-raises the OSError wrapped:)
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
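The agrovoc tests differ from the CLI ones only in protocol: opener.open() dispatches to https_open(), which calls do_open(http.client.HTTPSConnection, req, ...) with an ssl context and adds the client.py:1470 connect frame. Against 127.0.0.1:9 the TLS layer is never reached, because the plain TCP connect is refused first. The same path in miniature:

    import http.client
    import ssl

    # What https_open() sets up: an HTTPSConnection with an SSL context.
    ctx = ssl.create_default_context()
    conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=2, context=ctx)

    try:
        # request() triggers connect(); the refusal happens before any TLS.
        conn.request("GET", "/sparql?query=ASK%20WHERE%20%7B%7D")
    except ConnectionRefusedError as exc:
        print(exc)  # [Errno 111] Connection refused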
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

[traceback identical to testAskByGETinCSV above, with Accept: 'text/csv';
ConnectionRefusedError: [Errno 111] Connection refused connecting to 127.0.0.1:9,
re-raised as urllib.error.URLError at /usr/lib/python3.12/urllib/request.py:1347]

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:496:
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript']

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:545:
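The locals in these tracebacks also show why host is '127.0.0.1:9' while the Host header still names the origin server: when urllib routes an https URL through a proxy, do_open() opens the TCP connection to the proxy address and tunnels to the origin via set_tunnel() (req._tunnel_host). A sketch of the same code path, with the proxy address assumed from the host value seen above:

    import urllib.error
    import urllib.request

    # Route https traffic through a (dead) proxy at 127.0.0.1:9.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": "http://127.0.0.1:9/"}))
    try:
        opener.open("https://agrovoc.uniroma2.it/sparql", timeout=1)
    except urllib.error.URLError as err:
        # The TCP connection to the proxy itself is refused, so the
        # CONNECT tunnel to the origin is never even attempted.
        print(err.reason)  # [Errno 111] Connection refused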
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: '*/*']

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:629:
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript']

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:552:
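As the do_open() listing shows, any OSError raised while talking to the server is re-raised as urllib.error.URLError, with the original exception preserved as .reason. A caller that needs to distinguish a refused connection from, say, a DNS failure or a timeout can inspect that attribute; a minimal sketch:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        # err.reason is the underlying OSError from do_open()'s
        # `raise URLError(err)`; here a ConnectionRefusedError.
        print(type(err.reason).__name__)  # ConnectionRefusedError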
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: '*/*']

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:587:
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: 'text/tab-separated-values']

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:517:
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: 'text/tab-separated-values']

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:524:
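All of these tests depend on a live SPARQL endpoint, which a sandboxed build can never reach. One common way to keep such suites green offline is to probe the endpoint once and skip when it is unreachable; this is not what the package's test suite does, just a hypothetical sketch (endpoint_reachable and requires_network are invented names):

    import socket

    import pytest

    def endpoint_reachable(host, port, timeout=1.0):  # hypothetical helper
        """True if a TCP connection to (host, port) succeeds."""
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Decorator for tests that must reach the public endpoint.
    requires_network = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="SPARQL endpoint unreachable (offline build)",
    )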
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

[traceback identical to testAskByGETinCSV above (URLError: Connection refused), with
Accept: 'application/sparql-results+xml']

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:658:
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:667:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
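The create_connection() docstring quoted in these tracebacks describes its timeout and source_address parameters; a short usage sketch (host and port are illustrative assumptions):

    import socket

    conn = socket.create_connection(
        ("example.org", 443),    # any (host, port) 2-tuple
        timeout=10,              # set on the socket before connect()
        source_address=("", 0),  # '' / port 0: let the OS choose the local end
    )
    conn.close()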
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:457:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
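The "During handling of the above exception" chaining seen in each block means callers only ever see the URLError; the original OSError survives as its .reason attribute. A sketch, reusing the refused address from the log:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason is the ConnectionRefusedError raised in socket.connect()
        print(type(err.reason).__name__, err.reason)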
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:465:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
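For orientation, a hedged sketch of what each failing test drives through SPARQLWrapper 2.0.0's public API; the endpoint path and query here are placeholders, not copied from the suite:

    from SPARQLWrapper import SPARQLWrapper, XML, GET

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # assumed path
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError in the offline chroot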
____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinCSV(self):
>       result = self.__generic(askQuery, CSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:503:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
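The do_open() listing in these tracebacks forces "Connection: close" so the single response has a definite end before the socket is drained. The same one-shot pattern with http.client directly (host is a placeholder):

    import http.client

    conn = http.client.HTTPSConnection("example.org", timeout=10)
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    body = resp.read()  # safe to read to EOF: the server closes afterwards
    conn.close()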
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:510:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
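The docstring above also mentions all_errors: with it, create_connection() raises an ExceptionGroup holding every per-address failure instead of only the last one. A sketch (Python 3.11+ for except*):

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as eg:
        # one sub-exception per address attempted; here only IPv4 loopback
        print(len(eg.exceptions), "connection attempt(s) refused")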
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:559:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
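The POST variants carry the query URL-encoded in the request body, which is where the Content-Type and Content-Length values in their locals come from. A sketch with urllib (the endpoint is a placeholder):

    import urllib.parse
    import urllib.request

    data = urllib.parse.urlencode({"query": "ASK { ?s ?p ?o }"}).encode("ascii")
    req = urllib.request.Request(
        "https://example.org/sparql",
        data=data,  # urllib derives Content-Length from this body
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
    print(req.get_method(), len(data))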
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:650:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
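The *_Conneg variants only change content negotiation: the requested return format is expressed purely through the Accept header (the JSON alternates below are copied from the locals above; the URL is a placeholder):

    import urllib.request

    req = urllib.request.Request(
        "https://example.org/sparql?query=ASK%20%7B%20%3Fs%20%3Fp%20%3Fo%20%7D",
        headers={
            "Accept": "application/sparql-results+json,application/json,"
                      "text/javascript,application/javascript",
        },
    )
    print(req.get_header("Accept"))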
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:566:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
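Not something this package does, but one common way to keep such a suite green in a network-less chroot is to stub the opener before the wrapper calls it; a sketch with unittest.mock:

    import io
    import urllib.request
    from unittest import mock

    canned = io.BytesIO(b'<?xml version="1.0"?><sparql/>')  # stand-in response
    with mock.patch.object(urllib.request, "urlopen", return_value=canned):
        response = urllib.request.urlopen("https://example.org/sparql")
        print(response.read())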
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:608:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
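The "_Unexpected" tests above request serializations that ASK results do not have (N3, JSON-LD); judging by the "Accept: */*" in their locals, the wrapper appears to fall back to a generic Accept header rather than refusing the request. A sketch of such a request (the endpoint is a placeholder):

    from SPARQLWrapper import SPARQLWrapper, N3, POST

    sparql = SPARQLWrapper("https://example.org/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat(N3)  # not meaningful for ASK; the query is still sent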
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinTSV(self):
>       result = self.__generic(askQuery, TSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:531:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
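
The create_connection docstring quoted in these frames spells out its error handling: it tries every address returned by getaddrinfo and, on total failure, re-raises only the last error unless all_errors=True, in which case it raises an ExceptionGroup. A small sketch of that contract, assuming Python 3.11+ for all_errors and except*:

    import socket

    ADDR = ("127.0.0.1", 9)  # refused in this build environment

    # Default behaviour: only the last per-address error propagates
    # (here there is a single candidate address, so it is the only one).
    try:
        socket.create_connection(ADDR)
    except ConnectionRefusedError as err:
        print("last error:", err)

    # all_errors=True collects every attempt into one ExceptionGroup.
    try:
        socket.create_connection(ADDR, all_errors=True)
    except* ConnectionRefusedError as group:
        print("all errors:", group.exceptions)
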
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:538:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
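
The two-stage tracebacks ("During handling of the above exception, another exception occurred:") come from do_open's outer except clause, which wraps any OSError in urllib.error.URLError; that is why the test suite ultimately sees URLError rather than the raw ConnectionRefusedError. A reduced sketch of the same wrapping:

    import socket
    from urllib.error import URLError

    def open_like_do_open(address):
        try:
            socket.create_connection(address, timeout=5)
        except OSError as err:   # mirrors do_open's "except OSError as err:"
            raise URLError(err)  # the OSError is kept as err.reason/__context__

    try:
        open_like_do_open(("127.0.0.1", 9))
    except URLError as err:
        print(err)         # <urlopen error [Errno 111] Connection refused>
        print(err.reason)  # the original ConnectionRefusedError
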
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:676:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:685:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
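
Every test funnels through the same chain, sparql.query() -> QueryResult(self._query()) -> urlopener(request), varying only the HTTP method, the requested return format (which drives the Accept header visible in each dump), and whether the onlyConneg flag is set. A hedged sketch of what the __generic helper plausibly sets up with SPARQLWrapper 2.0.0's public API; the endpoint URL and query text here are placeholders, not values from the test suite:

    from SPARQLWrapper import SPARQLWrapper, POST, TSV

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)       # testAskByPOST* variants POST the form-encoded query
    sparql.setReturnFormat(TSV)  # selects the Accept header seen in the log
    sparql.setOnlyConneg(True)   # *_Conneg variants: content negotiation only,
                                 # no extra output-format URL parameters

    result = sparql.query()      # Wrapper.py: query() -> _query() -> urlopen()
    print(result.response.read())
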
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:473:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:481:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
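
Two header details that recur in every dump are explained by the do_open listing itself: the handler forces 'Connection: close' (urllib's addinfourl response object cannot manage a persistent connection) and then title-cases every header name with a dict comprehension. The normalization step in isolation:

    headers = {
        "accept": "*/*",
        "content-length": "302",
        "content-type": "application/x-www-form-urlencoded",
    }

    headers["Connection"] = "close"  # forced by do_open for every request
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': '*/*', 'Content-Length': '302',
    #  'Content-Type': 'application/x-www-form-urlencoded', 'Connection': 'close'}
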
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:874:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
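
The tunnel branch in do_open (req._tunnel_host) moves any Proxy-Authorization header out of the origin-bound headers and into the CONNECT tunnel headers, since the credentials are meant for the proxy alone. A sketch of the same idea with http.client directly; the proxy host, port, and credential are placeholders:

    import http.client

    headers = {"Accept": "*/*", "Proxy-Authorization": "Basic dXNlcjpwYXNz"}

    # Credentials go to the proxy with the CONNECT request, not to the origin.
    tunnel_headers = {}
    if "Proxy-Authorization" in headers:
        tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")

    conn = http.client.HTTPSConnection("proxy.example.org", 3128)  # placeholder proxy
    conn.set_tunnel("agrovoc.uniroma2.it", 443, headers=tunnel_headers)
    # conn.request("GET", "/sparql", headers=headers)  # would CONNECT via the proxy
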
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:831:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:839:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
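The Wrapper.py:960 -> 926 -> urlopen() chain in these tracebacks corresponds to
ordinary SPARQLWrapper usage along the following lines (a sketch; the endpoint
URL and query are illustrative, not taken from the test suite):

    from SPARQLWrapper import SPARQLWrapper, JSONLD, GET

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # placeholder URL
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)  # yields Accept: application/ld+json,application/x-json+ld
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    result = sparql.query()         # Wrapper.py:960 -> _query() -> urllib urlopen()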
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:901:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
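The do_open() source shown in each traceback explains why every request above
carries 'Connection': 'close': urllib's response wrapper cannot manage a
persistent connection, so the handler forces a one-shot exchange. A sketch of
the same one-shot pattern with http.client directly (placeholder host):

    import http.client

    conn = http.client.HTTPSConnection("example.org")   # placeholder host
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    body = resp.read()   # safe to read to EOF; the server closes afterwards
    conn.close()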
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:910:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
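The req._tunnel_host branch in do_open() covers HTTPS-through-proxy requests:
Proxy-Authorization is moved onto the CONNECT tunnel so it never reaches the
origin server. A sketch of the underlying http.client API (the proxy host and
credentials below are hypothetical):

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128)   # hypothetical proxy
    conn.set_tunnel(
        "agrovoc.uniroma2.it", 443,
        headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"},  # sent on CONNECT only
    )
    conn.request("GET", "/")
    resp = conn.getresponse()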
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:806:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
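The create_connection() body that each failure reprints boils down to a
getaddrinfo() fallback loop; a simplified standalone equivalent (a sketch, not
the stdlib code itself) looks like this:

    import socket

    def connect_first_working(host, port, timeout=5.0):
        """Try each resolved address in turn, keeping the collected errors."""
        errors = []
        for family, kind, proto, _canon, addr in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = socket.socket(family, kind, proto)
            sock.settimeout(timeout)
            try:
                sock.connect(addr)
                return sock                  # first address that works wins
            except OSError as exc:
                errors.append(exc)
                sock.close()
        # Mirrors the stdlib behaviour seen above: re-raise a collected error.
        raise errors[0] if errors else OSError("getaddrinfo returned no results")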
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:738:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:772:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:935:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
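Every failure in this run follows the same pattern: the suite tries to reach a
live endpoint, but connections end up at 127.0.0.1 port 9 (the discard port),
where nothing listens, so every connect() is refused. A hypothetical guard a
suite like this could use to skip live-endpoint tests when offline (the marker
below is not part of the existing tests; python3-pytest is in the build-deps):

    import socket
    import pytest

    def endpoint_reachable(host, port, timeout=1.0):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="live SPARQL endpoint unreachable in this build environment",
    )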
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:943:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
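The two testConstructByGETinUnknow* cases pass the bogus return format "foo";
the request headers above show SPARQLWrapper falling back to a default Accept
of application/rdf+xml for CONSTRUCT queries. A sketch of that behaviour
(placeholder endpoint; the exact fallback mechanics are inferred from the log):

    from SPARQLWrapper import SPARQLWrapper, GET

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # placeholder URL
    sparql.setMethod(GET)
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat("foo")  # unsupported; a default format is kept instead
    # The resulting request carries Accept: application/rdf+xml, as logged above.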
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:700:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:850: in create_connection
E   ConnectionRefusedError: [Errno 111] Connection refused

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:707:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
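Note: every failure in this run is the same event. pbuilder disables network access for the build, so the suite's SPARQL endpoint is unreachable and each request dies at connect() with [Errno 111] before any HTTP is exchanged; the `host = '127.0.0.1:9'` locals show the connection being pointed at port 9 (traditionally the discard service), where nothing listens in the chroot. A minimal standalone sketch of the failing step, not code from the test suite:

    import socket

    # Nothing listens on 127.0.0.1:9 inside the offline build chroot, so
    # the kernel rejects the TCP handshake -- the same [Errno 111] seen
    # in every traceback above.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused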
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:893:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:847:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
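For context, the call chain repeated in each traceback (test/test_fuseki2__v3_6_0__agrovoc.py:190 -> Wrapper.py:960 -> Wrapper.py:926 -> urlopen) corresponds to ordinary SPARQLWrapper client code along these lines. A sketch only: the endpoint URL and query are illustrative stand-ins, not values taken from the suite, which this offline build maps to 127.0.0.1:9.

    from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

    # Illustrative endpoint; the requests above carry the Host header
    # agrovoc.uniroma2.it while being routed to 127.0.0.1:9.
    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)           # Content-Type: application/x-www-form-urlencoded
    sparql.setReturnFormat(JSONLD)   # Accept: application/ld+json,application/x-json+ld
    sparql.setOnlyConneg(True)       # what the *_Conneg test variants exercise

    result = sparql.query()          # -> urlopen(); raises URLError when unreachable
    graph = result.convert()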
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:855:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
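The two-stage tracebacks ("During handling of the above exception, another exception occurred") come from the try/except visible in the do_open() listing above: the low-level OSError from the socket layer is caught and re-raised as urllib.error.URLError. Reduced to its essentials, as a standalone sketch:

    import socket
    from urllib.error import URLError

    try:
        try:
            socket.create_connection(("127.0.0.1", 9), timeout=5)
        except OSError as err:   # ConnectionRefusedError is a subclass of OSError
            raise URLError(err)  # the re-raise at urllib/request.py:1347
    except URLError as exc:
        print(exc.reason)        # [Errno 111] Connection refused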
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:918:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:927:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
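These failures indicate an environment limitation (no network during the build) rather than a regression in sparql-wrapper-python itself. One conventional way to keep such a suite green offline is to probe the endpoint once and skip the network-bound tests when it cannot be reached. A hypothetical sketch, not code from the package:

    import socket

    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Best-effort TCP probe; False when the endpoint cannot be reached."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Hypothetical marker; the existing suite queries the endpoint unconditionally.
    needs_endpoint = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="SPARQL endpoint unreachable (network disabled during build)",
    )

    @needs_endpoint
    def test_construct_query_roundtrip():
        ...  # would exercise the live endpoint here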
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:823:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
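Reading the request locals across these failures shows how each requested return format surfaces as an Accept header on the wire. Summarizing only what this log shows, not SPARQLWrapper's full format table:

    # Accept headers observed in the request locals above, per return format:
    OBSERVED_ACCEPT = {
        "XML/RDFXML": "application/rdf+xml",
        "JSONLD":     "application/ld+json,application/x-json+ld",
        "N3":         "application/turtle,text/turtle,text/rdf+n3,"
                      "application/n-triples,application/n3,text/n3",
        "TURTLE":     "application/turtle,text/turtle",
        "CSV/JSON":   "*/*",  # the 'Unexpected' cases for CONSTRUCT queries
    }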
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() and create_connection() frames identical to testConstructByGETinXML above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:755:
[__generic()/urlopen() call chain and repeated do_open() frame identical to testConstructByGETinXML above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
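One detail worth noting in the create_connection() listing above: the loop tries every address returned by getaddrinfo(), accumulates each OSError, and, once the list is exhausted, re-raises the first one (the `raise exceptions[0]` frame at socket.py:865). That is why a single ConnectionRefusedError surfaces even if several address families were attempted. A trimmed re-statement of that loop, under the same assumptions:

    import socket

    def connect_to_any(host: str, port: int, timeout: float = 5.0) -> socket.socket:
        """Try each resolved address in turn, keeping errors, like the stdlib loop."""
        exceptions: list[OSError] = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as err:
                exceptions.append(err)
                if sock is not None:
                    sock.close()
        if not exceptions:
            raise OSError(f"getaddrinfo returned no usable addresses for {host!r}")
        raise exceptions[0]  # mirrors socket.py:865 in the traceback above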
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

[identical traceback; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524',
  'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:951:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
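The create_connection() docstring quoted in each traceback mentions the *all_errors* flag (added in Python 3.11). A short sketch of what it changes, against the same unreachable address:

    # With all_errors=False (the default) only the last per-address error is
    # raised; with all_errors=True every failure is collected into an
    # ExceptionGroup, which except* can match by type.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print("refused:", exc)   # [Errno 111] Connection refused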
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

[identical traceback; headers as above with 'Content-Length': '490']

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:959:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

[identical traceback; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524',
  'Content-Type': 'application/x-www-form-urlencoded', ...}]

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:714:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
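For reference, the failing tests' __generic() helper boils down to a query like the following. This is a sketch against the public SPARQLWrapper 2.0.0 API visible in the traceback (SPARQLWrapper/Wrapper.py); the endpoint URL and query text are placeholders:

    from SPARQLWrapper import POST, XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)        # the suite exercises both GET and POST
    sparql.setReturnFormat(XML)   # and several return formats (TURTLE, JSONLD, ...)
    result = sparql.query()       # raises urllib.error.URLError when the endpoint is unreachable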
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

[identical traceback; headers as above with 'Content-Length': '490']

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:721:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

[identical traceback; GET request, headers:
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1143:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
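Network-dependent tests like these could be skipped rather than failed in offline build environments. One conventional pytest pattern (a sketch, not something this package's test suite actually does) is to probe the endpoint once and mark the whole module:

    import socket

    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Module-level mark: skip every test in this file when the endpoint is down.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="SPARQL endpoint unreachable (network disabled during build)",
    )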
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

[identical traceback; headers:
 {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
  'Host': 'agrovoc.uniroma2.it',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1103:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
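The "During handling of the above exception, another exception occurred" banner in every failure comes from do_open() re-raising the socket error as URLError. The same implicit chain can be reproduced in isolation (a sketch):

    import urllib.error

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:                 # mirrors do_open(): raise URLError(err)
            raise urllib.error.URLError(err)
    except urllib.error.URLError as e:
        print(e)              # <urlopen error [Errno 111] Connection refused>
        print(e.__context__)  # the original ConnectionRefusedError, chained implicitly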
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

[identical traceback and headers as testDescribeByGETinJSONLD]

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1110:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
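The urllib frames repeated in each traceback (urlopen -> open -> _open -> _call_chain -> https_open -> do_open) are urllib's handler-chain dispatch. The equivalent explicit invocation looks like this (a sketch; the URL mirrors the remapped endpoint):

    import urllib.error
    import urllib.request

    # build_opener() installs the default handlers, including HTTPSHandler,
    # whose https_open() ultimately calls do_open() as seen above.
    opener = urllib.request.build_opener()
    request = urllib.request.Request("https://127.0.0.1:9/sparql")
    try:
        opener.open(request, timeout=5)
    except urllib.error.URLError as err:
        print(err.reason)   # [Errno 111] Connection refused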
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

[identical traceback; headers:
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1170:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1179:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1079:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
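The only per-test variation in these failures is the Accept header, which SPARQLWrapper derives from the requested return format; the *_Conneg tests negotiate solely through this header. A sketch of the equivalent hand-built GET request, where the endpoint URL and the DESCRIBE query are placeholders:

    import urllib.parse
    import urllib.request

    endpoint = "https://agrovoc.uniroma2.it/sparql"   # placeholder URL
    query = "DESCRIBE <http://example.org/resource>"  # placeholder query

    url = endpoint + "?" + urllib.parse.urlencode({"query": query})
    req = urllib.request.Request(url, headers={
        # Accept list as sent by the N3 test above
        "Accept": "application/turtle,text/turtle,text/rdf+n3,"
                  "application/n-triples,application/n3,text/n3",
    })
    # urllib.request.urlopen(req) would fail here exactly as in the log.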
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1011:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1045:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
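For orientation, the tests' __generic helper drives SPARQLWrapper roughly as below. This is a sketch built from the library's public API, with a placeholder endpoint and query; setOnlyConneg(True) corresponds to the onlyConneg=True variants seen in the test calls:

    from SPARQLWrapper import GET, N3, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # placeholder
    sparql.setQuery("DESCRIBE <http://example.org/resource>")     # placeholder
    sparql.setReturnFormat(N3)  # controls the Accept header seen above
    sparql.setMethod(GET)
    sparql.setOnlyConneg(True)  # content negotiation via Accept only
    result = sparql.query()     # raises URLError when the endpoint is unreachable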
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1204:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1212:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
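Failures of this shape are expected in a network-less build environment. One common way to keep such a suite green offline, sketched here with pytest (not what this package currently does), is to probe the endpoint once and skip the whole module:

    import socket

    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # At module level this skips every test in the file when offline.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="SPARQL endpoint unreachable (network disabled during build)",
    )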
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:973:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:980:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to testDescribeByGETinJSON_Unexpected above)
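The create_connection() listing repeated in these tracebacks also documents the all_errors flag added in Python 3.11: by default only the last per-address error is re-raised (the raise exceptions[0] frame above), while all_errors=True raises an ExceptionGroup covering every address that was tried. A sketch of the difference, again against the unreachable discard port:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except ExceptionGroup as group:
        # One ConnectionRefusedError per address getaddrinfo returned.
        print(len(group.exceptions), "underlying connection errors")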
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
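The second half of each block is urllib re-raising the socket error: do_open() catches the OSError from h.request() and wraps it in URLError (urllib/request.py:1347), which is why pytest prints two chained tracebacks per test. A sketch of the same wrapping, with an illustrative /sparql path:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as exc:
        # .reason carries the original socket-level error
        print(exc.reason)  # [Errno 111] Connection refused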
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)
test/test_fuseki2__v3_6_0__agrovoc.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
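The frames under test/test_fuseki2__v3_6_0__agrovoc.py:190 show the code path the suite exercises: SPARQLWrapper.query() hands a urllib request to urlopen() and propagates whatever it raises. A sketch of that call path; the endpoint URL and DESCRIBE target are illustrative stand-ins, not the suite's exact query:

    from urllib.error import URLError

    from SPARQLWrapper import GET, XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # assumed URL
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    sparql.setQuery("DESCRIBE <http://example.org/resource>")  # hypothetical IRI

    try:
        # Wrapper.py:960 -> _query() -> urlopen(); with no reachable
        # endpoint this raises instead of returning a QueryResult.
        result = sparql.query()
    except URLError as exc:
        print(exc.reason)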
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
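The create_connection() docstring quoted throughout explains the *all_errors* switch: the loop collects one failure per getaddrinfo() result and by default re-raises only the last (socket.py:865). A sketch of the Python 3.11+ ExceptionGroup variant, again against the unreachable 127.0.0.1:9:

    import socket

    try:
        # all_errors=True delivers every per-address failure at once
        # instead of just the last one.
        socket.create_connection(("127.0.0.1", 9), all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)  # [Errno 111] Connection refused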
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)
test/test_fuseki2__v3_6_0__agrovoc.py:1187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1196:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1096:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1028:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1062:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
test/test_fuseki2__v3_6_0__agrovoc.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
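All of these DESCRIBE tests require a live Fuseki endpoint, so under a network-less build every one of them fails identically. A hypothetical guard, not part of the actual suite or the Debian packaging, that would skip such tests cleanly instead:

    import socket

    import pytest

    def endpoint_reachable(host: str, port: int) -> bool:
        """Best-effort TCP probe; False inside a network-less chroot."""
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            return False

    # Hypothetical marker for the live-endpoint tests.
    requires_network = pytest.mark.skipif(
        not endpoint_reachable("agrovoc.uniroma2.it", 443),
        reason="SPARQL endpoint unreachable (network disabled during build)",
    )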
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
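As the create_connection() listing above shows, each address returned by getaddrinfo() is tried in turn, and with all_errors left False only the last error is re-raised; loopback yields a single candidate, so the one refused connect() is what surfaces. The same error can be observed directly (sketch, assuming nothing listens on port 9):

# Sketch: socket.create_connection() re-raises the last per-address error,
# here the single loopback candidate's ConnectionRefusedError.
import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as exc:
    print(exc.errno, exc.strerror)  # 111 Connection refused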
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:987:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
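The POST variants differ only in the body and headers SPARQLWrapper composes (note the Content-Length values 323 vs. 289 for the plain and _Conneg variants), and that much can be inspected without any network traffic. A sketch with a placeholder endpoint URL, not the suite's real one:

# Sketch (placeholder endpoint): inspect the composed query offline.
from SPARQLWrapper import SPARQLWrapper, XML, POST

sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder URL
sparql.setQuery("DESCRIBE <http://example.org/resource>")  # illustrative query
sparql.setMethod(POST)
sparql.setReturnFormat(XML)
print(sparql.queryString)  # the query text that query() would POST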
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:994:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_6_0__agrovoc.py:1253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
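The body of testKeepAlive, quoted above, is also the canonical SPARQLWrapper usage pattern. Rewritten as a self-contained sketch with a placeholder endpoint (the test's endpoint name is defined elsewhere in the suite):

# Sketch of the testKeepAlive pattern, assuming a reachable endpoint; the URL
# is a placeholder, and setUseKeepAlive() only takes effect when the optional
# 'keepalive' module is installed (otherwise SPARQLWrapper warns and goes on).
from SPARQLWrapper import SPARQLWrapper, JSON, GET

sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder URL
sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
sparql.setReturnFormat(JSON)
sparql.setMethod(GET)
sparql.setUseKeepAlive()
results = sparql.query().convert()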
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1238:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
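testQueryBadFormed never reaches its assertion: assertRaises() expects QueryBadFormed, which SPARQLWrapper raises when an endpoint answers HTTP 400, but with no server the request dies earlier with URLError. A sketch of the behaviour the test expects from a live endpoint (placeholder URL, illustrative malformed query):

# Sketch (placeholder endpoint): a live server answers a broken query with
# HTTP 400, which SPARQLWrapper maps to QueryBadFormed; offline, the same
# call raises URLError before any HTTP status exists.
from SPARQLWrapper import SPARQLWrapper, XML
from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder URL
sparql.setQuery("SELECT * WHERE { this is not SPARQL")  # deliberately malformed
sparql.setReturnFormat(XML)
try:
    sparql.query()
except QueryBadFormed as exc:
    print("endpoint rejected the query:", exc)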
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1244:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
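As the frame at urllib/request.py:1347 shows, do_open() converts the OSError from the failed connect into urllib.error.URLError before it reaches SPARQLWrapper. A short sketch of catching that at the call site, against the same closed local port; the wrapped ConnectionRefusedError stays available as the .reason attribute:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=2.0)
    except urllib.error.URLError as err:
        # err.reason holds the original OSError, here a ConnectionRefusedError
        print("request failed:", err.reason)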
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinCSV>

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
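All of these tests funnel through sparql.query() (SPARQLWrapper/Wrapper.py:960), and the 'Accept: text/csv' header in the dump above is derived from the requested return format. A hedged sketch of the equivalent direct usage; the endpoint URL and query text are placeholders, not the suite's actual fixtures:

    from SPARQLWrapper import CSV, GET, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # placeholder URL
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")        # placeholder query
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)   # -> Accept: text/csv, as in the log
    result = sparql.query()       # raises URLError while the endpoint is unreachable
    print(result.convert())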
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:302:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
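The http.client frames in the full tracebacks make the failure point clear: HTTPSConnection opens its socket lazily, only when the first request is sent (send() -> connect() -> create_connection()), which is why every traceback surfaces inside h.request() rather than at construction time. A stdlib-only reproduction against the same closed port:

    import http.client

    conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=2.0)
    try:
        conn.request("GET", "/")  # connect() happens here, not in the constructor
    except ConnectionRefusedError as err:
        print("connect failed inside request():", err)
    finally:
        conn.close()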
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:386:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:309:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
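The _Conneg test variants differ only in how the Accept header is negotiated: testSelectByGETinJSON_Conneg still sends the application/sparql-results+json list seen above, while the _Unexpected_Conneg cases fall back to '*/*'. A sketch of the corresponding client-side switch, assuming SPARQLWrapper's setOnlyConneg() behaves as its name and the test parameter suggest (an assumption from the test names; endpoint and query are placeholders):

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # placeholder URL
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")        # placeholder query
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    # Assumption: rely on content negotiation only, i.e. send the Accept
    # header without adding format-specific query parameters.
    sparql.setOnlyConneg(True)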
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinTSV>

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
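The tunnel branch of do_open() (kept in full in the first listing of this section) moves Proxy-Authorization onto the CONNECT request so that it is never forwarded to the origin server. The same handling done by hand with http.client; the proxy host and credentials here are purely illustrative:

    import base64
    import http.client

    creds = base64.b64encode(b"user:secret").decode()  # illustrative credentials
    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=2.0)
    # Proxy-Authorization travels on the CONNECT request, not on the
    # tunnelled request to the origin server.
    conn.set_tunnel("agrovoc.uniroma2.it", 443,
                    headers={"Proxy-Authorization": "Basic " + creds})
    try:
        conn.request("GET", "/")
    except OSError as err:
        print("tunnel failed:", err)
    finally:
        conn.close()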
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:281: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
The remaining failures in this run repeat the traceback above verbatim; only the test method, the triggering call (all of them go through test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic, result = sparql.query()), and the request headers differ. Every request again targets host '127.0.0.1:9' with Host: agrovoc.uniroma2.it, and every test fails with the same chained pair:

    ConnectionRefusedError: [Errno 111] Connection refused                    (socket.py:850)
    urllib.error.URLError: <urlopen error [Errno 111] Connection refused>     (urllib/request.py:1347)

__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

test/test_fuseki2__v3_6_0__agrovoc.py:415: in testSelectByGETinUnknow
    result = self.__generic(selectQuery, "foo", GET)
Accept: application/sparql-results+xml

______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

test/test_fuseki2__v3_6_0__agrovoc.py:424: in testSelectByGETinUnknow_Conneg
    result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
Accept: application/sparql-results+xml
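create_connection(), quoted in full in the exemplar, tries each address that getaddrinfo() returns, collects the failures, and re-raises the last error (the `raise exceptions[0]` frame in the traceback), or an ExceptionGroup of all of them when all_errors=True (Python 3.11+). A small sketch of both behaviours, again assuming nothing listens on port 9:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as exc:
        print(exc)                # [Errno 111] Connection refused

    # Python 3.11+: keep every per-address error instead of only the last one.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except ExceptionGroup as group:
        print([str(e) for e in group.exceptions])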
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

test/test_fuseki2__v3_6_0__agrovoc.py:214: in testSelectByGETinXML
    result = self.__generic(selectQuery, XML, GET)
Accept: application/sparql-results+xml

________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

test/test_fuseki2__v3_6_0__agrovoc.py:222: in testSelectByGETinXML_Conneg
    result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
Accept: application/sparql-results+xml
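The application side of every traceback is the same three SPARQLWrapper frames (query() -> _query() -> urlopener(request)). The test helper __generic() itself is not shown in this log; the sketch below reconstructs the equivalent call with SPARQLWrapper's public API. The endpoint URL and query text are placeholders, and setOnlyConneg() corresponds to the tests' onlyConneg=True:

    from urllib.error import URLError
    from SPARQLWrapper import GET, XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # assumed endpoint path
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")        # placeholder query
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    sparql.setOnlyConneg(True)    # negotiate via the Accept header only

    try:
        result = sparql.query()   # Wrapper.py:960 -> _query() -> urlopener(request)
    except URLError as exc:
        print(exc.reason)         # the underlying ConnectionRefusedError here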
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

test/test_fuseki2__v3_6_0__agrovoc.py:260: in testSelectByPOSTinCSV
    result = self.__generic(selectQueryCSV_TSV, CSV, POST)
Accept: text/csv; Content-Type: application/x-www-form-urlencoded; Content-Length: 466

_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

test/test_fuseki2__v3_6_0__agrovoc.py:267: in testSelectByPOSTinCSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
Accept: text/csv; Content-Type: application/x-www-form-urlencoded; Content-Length: 432
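The POST variants differ from the GET ones only in transport: the query is urlencoded into the request body, which is why their header dicts carry Content-Type: application/x-www-form-urlencoded and a Content-Length, while the GET requests carry neither. A sketch with plain urllib (the endpoint URL is a placeholder):

    from urllib.parse import urlencode
    from urllib.request import Request

    query = "SELECT * WHERE { ?s ?p ?o } LIMIT 5"

    # GET: the query travels in the URL's query string.
    get_req = Request("https://agrovoc.uniroma2.it/sparql?" + urlencode({"query": query}))

    # POST: the query travels urlencoded in the body; urllib switches the
    # method automatically when data is supplied, and http.client fills in
    # Content-Length at send time.
    post_req = Request(
        "https://agrovoc.uniroma2.it/sparql",
        data=urlencode({"query": query}).encode("utf-8"),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

    print(get_req.get_method(), post_req.get_method())   # GET POST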
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

test/test_fuseki2__v3_6_0__agrovoc.py:316: in testSelectByPOSTinJSON
    result = self.__generic(selectQuery, JSON, POST)
Accept: application/sparql-results+json,application/json,text/javascript,application/javascript
Content-Type: application/x-www-form-urlencoded; Content-Length: 393

________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

test/test_fuseki2__v3_6_0__agrovoc.py:407: in testSelectByPOSTinJSONLD_Unexpected_Conneg
    result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
Accept: */*; Content-Type: application/x-www-form-urlencoded; Content-Length: 356
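The Accept headers recorded across these failures track the requested return format. The mapping below is read directly off this log, not from SPARQLWrapper's internals; note the two fallbacks: */* for a format that is unexpected for a SELECT query (JSONLD), and the XML media type for the unknown format "foo":

    # Accept headers as observed in this test run, keyed by requested format.
    ACCEPT_SEEN = {
        "XML":  "application/sparql-results+xml",
        "CSV":  "text/csv",
        "TSV":  "text/tab-separated-values",
        "JSON": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "JSONLD (unexpected for SELECT)": "*/*",
        "unknown format 'foo'": "application/sparql-results+xml",
    }

    for fmt, accept in ACCEPT_SEEN.items():
        print(f"{fmt:32} -> {accept}")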
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:323: in testSelectByPOSTinJSON_Conneg
    result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
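The create_connection() docstring shown earlier also describes the *all_errors* flag. A sketch of that variant (Python 3.11+), under the same no-listener assumption:

    import socket

    # With all_errors=True, the failure for every resolved address is
    # collected into an ExceptionGroup rather than only the last error
    # being raised.
    try:
        socket.create_connection(("localhost", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print(exc)  # one entry per address that was tried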
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:365: in testSelectByPOSTinN3_Unexpected_Conneg
    result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
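The http.client frames in these tracebacks show why the error surfaces at the TCP layer: HTTPSConnection.connect() first runs the plain HTTPConnection.connect(), which calls socket.create_connection(); the TLS wrap would only happen after a socket exists. A short sketch of that path:

    import http.client

    # connect() fails before any TLS handshake because the TCP
    # connection itself is refused.
    conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5)
    try:
        conn.connect()
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused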
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:288: in testSelectByPOSTinTSV
    result = self.__generic(selectQueryCSV_TSV, TSV, POST)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:295: in testSelectByPOSTinTSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
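do_open() converts the OSError into urllib.error.URLError, which is why every failure here is reported twice ("During handling of the above exception..."). The original exception stays available on the .reason attribute; a sketch against the same dead endpoint:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # .reason carries the underlying ConnectionRefusedError
        print(type(err.reason).__name__)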
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:433: in testSelectByPOSTinUnknow
    result = self.__generic(selectQuery, "bar", POST)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
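The header dumps always show canonical spellings such as 'Content-Length' because do_open() title-cases every header name before sending (headers = {name.title(): val ...} in the listing above):

    # str.title() is what normalizes the header names:
    for name in ("content-length", "ACCEPT", "user-agent"):
        print(name.title())
    # Content-Length
    # Accept
    # User-Agent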
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:442: in testSelectByPOSTinUnknow_Conneg
    result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
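The SPARQLWrapper/Wrapper.py frames suggest what the test helper __generic() does before query() fails. A hedged sketch, with a hypothetical endpoint URL and query standing in for whatever the suite actually sends to the unreachable 127.0.0.1:9:

    from SPARQLWrapper import SPARQLWrapper, POST, XML

    sparql = SPARQLWrapper("https://127.0.0.1:9/ds/sparql")  # hypothetical
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")   # hypothetical
    sparql.setMethod(POST)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError in this log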
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:230: in testSelectByPOSTinXML
    result = self.__generic(selectQuery, XML, POST)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
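The tunnelling branch of do_open() never runs in these tests (no proxy is configured), but the listing explains its purpose: Proxy-Authorization must accompany only the CONNECT request, never the origin request. A sketch with a hypothetical proxy host and placeholder credentials:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=5)
    # set_tunnel() records the real origin; the credentials header is
    # sent to the proxy on CONNECT and stripped from the origin request.
    conn.set_tunnel("zbw.eu", 443,
                    headers={"Proxy-Authorization": "Basic <credentials>"})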
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:238: in testSelectByPOSTinXML_Conneg
    result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
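From here the failures switch from test_fuseki2__v3_6_0__agrovoc.py to test_fuseki2__v3_8_0__stw.py, and from https_open()/HTTPSConnection to http_open()/HTTPConnection; which handler runs is selected purely by the URL scheme. One way to see the handlers a default opener installs:

    import urllib.request

    opener = urllib.request.build_opener()
    for handler in opener.handlers:
        # includes HTTPHandler and HTTPSHandler among others
        print(type(handler).__name__)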
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV(self): > result = self.__generic(askQuery, CSV, GET) test/test_fuseki2__v3_8_0__stw.py:493: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError

_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:500:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
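Every failure in this run has the same shape: the suite's endpoint resolves to 127.0.0.1:9 (a port where nothing listens) while the Host header still names zbw.eu, so with networking unavailable in the build environment every connect() is refused, and urllib wraps the resulting OSError in URLError. A minimal sketch of the failing call path, assuming SPARQLWrapper 2.0.0 is importable; the endpoint URL is hypothetical:

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, CSV, GET

    # Hypothetical endpoint mirroring the test setup: nothing listens on port 9.
    sparql = SPARQLWrapper("http://127.0.0.1:9/beta/sparql/stw/query")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)
    try:
        sparql.query()
    except URLError as err:
        # err.reason is the underlying ConnectionRefusedError (errno 111)
        print(err.reason)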
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:549:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
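The create_connection() listing in the first traceback above explains why the frame chain ends in "raise exceptions[0]": the function tries every getaddrinfo() result in turn, collects each failed connect(), and only re-raises the first error once all candidates are exhausted. A simplified, self-contained rendition of that loop (connect_first is a hypothetical helper name, not stdlib API):

    import socket

    def connect_first(host, port, timeout=None):
        # Try each resolved address in turn, as socket.create_connection() does;
        # keep the errors and raise the first one if no candidate succeeds.
        errors = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                if timeout is not None:
                    sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as exc:
                errors.append(exc)
                if sock is not None:
                    sock.close()
        raise errors[0]

    # connect_first("127.0.0.1", 9) raises ConnectionRefusedError: [Errno 111]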
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:633:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
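The header dictionaries shown in each failure are the product of the normalization do_open() performs before sending: unredirected headers take precedence, Connection: close is forced because addinfourl cannot manage a persistent connection, and every name is Title-Cased. A standalone sketch of just that dictionary dance; which header starts in which dict is illustrative here, the values are taken from this log:

    unredirected = {"Host": "zbw.eu"}
    added = {
        "Accept": "*/*",
        "User-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
    }

    headers = dict(unredirected)                 # unredirected headers win
    headers.update({k: v for k, v in added.items() if k not in headers})
    headers["Connection"] = "close"              # no persistent connections
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # note 'User-agent' has become 'User-Agent'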
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:556:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:591:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
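The inner/outer exception pair in each failure comes from the final frames of do_open(): any OSError raised while sending (here ECONNREFUSED) is caught and re-raised as urllib.error.URLError, carrying the original exception as the reason. A tiny sketch of that wrapping, using stdlib only:

    from urllib.error import URLError

    try:
        # Stand-in for the refused connect() attempt seen in the log.
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:            # matches do_open's `except OSError as err:`
        wrapped = URLError(err)

    print(wrapped.reason)             # [Errno 111] Connection refused
    print(wrapped.reason.errno)       # 111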
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_fuseki2__v3_8_0__stw.py:521:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:528:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
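Across these failures the only varying input is the Accept header SPARQLWrapper derives from the requested return format; the traceback is otherwise identical. The mapping below is copied from the headers shown in this log: the "unexpected" JSON-LD and N3 cases fall back to */* (those formats do not apply to an ASK query), and the unknown format "foo" falls back to the XML default:

    # Accept values observed in the failures above, keyed by requested format.
    ACCEPT_BY_FORMAT = {
        "CSV":  "text/csv",
        "TSV":  "text/tab-separated-values",
        "JSON": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "XML":  "application/sparql-results+xml",
        "foo":  "application/sparql-results+xml",   # unknown -> XML default
        "JSONLD (unexpected for ASK)": "*/*",
        "N3 (unexpected for ASK)":     "*/*",
    }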
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:662:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
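The "During handling of the above exception, another exception occurred:" banner in each full traceback is implicit exception chaining: URLError is raised inside the except OSError handler, so Python records the ConnectionRefusedError as its __context__, and pytest prints both tracebacks. A sketch, assuming nothing beyond the standard library:

    from urllib.error import URLError

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            raise URLError(err)    # implicit chaining: __context__ is set
    except URLError as wrapped:
        print(type(wrapped.__context__).__name__)  # ConnectionRefusedError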
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:671:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:461:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
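The call chain repeated in these tracebacks is driven by the __generic() helper in the test module; in SPARQLWrapper terms each ASK test reduces to the pattern below. The endpoint URL here is a placeholder (the real one is hosted at zbw.eu per the Host header and is unreachable in this chroot):

    from SPARQLWrapper import GET, SPARQLWrapper, XML

    sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)   # sends Accept: application/sparql-results+xml
    result = sparql.query()       # Wrapper.py:960 -> urlopen -> URLError above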
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
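The *_Conneg variants appear to exercise plain HTTP content negotiation: only the Accept header announces the desired result format, with no extra output-format parameter. A sketch of that request shape, with a placeholder URL and a hand-encoded ASK query:

    import urllib.error
    import urllib.request

    req = urllib.request.Request(
        "http://example.org/sparql?query=ASK%20%7B%3Fs%20%3Fp%20%3Fo%7D",
        headers={"Accept": "application/sparql-results+xml"},
    )
    try:
        urllib.request.urlopen(req, timeout=5)
    except urllib.error.URLError as exc:
        print(exc.reason)   # connection refused in this chroot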
____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinCSV(self):
>       result = self.__generic(askQuery, CSV, POST)

test/test_fuseki2__v3_8_0__stw.py:507:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
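For the POST variants the query travels form-encoded in the request body, which is where the 'application/x-www-form-urlencoded' Content-Type and the varying Content-Length values in the captured headers come from (SPARQLWrapper also adds its own default parameters, so the logged lengths exceed that of a bare query). A sketch of that encoding, with a placeholder URL; it runs without any network access:

    import urllib.parse
    import urllib.request

    query = "ASK WHERE { ?s ?p ?o }"
    data = urllib.parse.urlencode({"query": query}).encode("ascii")
    req = urllib.request.Request(
        "http://example.org/sparql",      # placeholder endpoint
        data=data,                        # a body makes urllib send POST
        headers={"Accept": "text/csv"},
    )
    print(req.get_method())               # POST
    print(len(data))                      # becomes the Content-Length header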
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:514:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:563:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
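Because do_open() converts every OSError raised during the connection attempt into URLError (request.py:1347 in the frames above), callers see a single exception type and can recover the original socket error from its .reason attribute. A sketch, again using the unreachable address from the log:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as exc:
        # .reason is the original OSError from the socket layer
        assert isinstance(exc.reason, ConnectionRefusedError)
        print(exc)   # <urlopen error [Errno 111] Connection refused>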
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:654:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

    ...
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:570:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
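These failures are environmental rather than functional: no assertion in the suite is ever reached. One common way to let such a suite degrade gracefully in offline builds (a hypothetical guard, not something test_fuseki2__v3_8_0__stw.py currently does) is to probe the endpoint once and skip:

    import socket
    import unittest

    def endpoint_up(host: str, port: int) -> bool:
        # Best-effort reachability probe for the test endpoint.
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_up("127.0.0.1", 9),
                         "SPARQL endpoint unreachable")
    class GuardedSPARQLTests(unittest.TestCase):
        pass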
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:612: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinTSV(self):
>       result = self.__generic(askQuery, TSV, POST)

test/test_fuseki2__v3_8_0__stw.py:535:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:542:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
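Editor's note: the __generic helper in the test module wraps the SPARQLWrapper public API. A sketch of roughly what the TSV/POST variants above execute (the endpoint URL is a placeholder for the one configured in the test module, and the query text is invented here):

    from SPARQLWrapper import SPARQLWrapper, POST, TSV

    sparql = SPARQLWrapper("http://127.0.0.1:9/ds/query")  # unreachable in this chroot
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat(TSV)
    sparql.setOnlyConneg(True)  # the *_Conneg variants negotiate via Accept only
    result = sparql.query()     # raises urllib.error.URLError as logged above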
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_fuseki2__v3_8_0__stw.py:680:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:689:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
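Editor's note: all of these errors are environmental (no Fuseki endpoint is reachable while network access is disabled during the build), not defects in the code under test. One common way to make such suites degrade gracefully is to probe the endpoint and skip network-bound tests when it is down; a sketch, not what this package's test suite currently does:

    import socket

    import pytest

    def endpoint_up(host="127.0.0.1", port=9, timeout=1.0):
        """Return True when a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Applied at module level, this would turn every failure above into a skip.
    pytestmark = pytest.mark.skipif(
        not endpoint_up(), reason="SPARQL endpoint unreachable (network disabled)"
    )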
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:477:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:485:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:878:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
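Editor's note: the create_connection listing quoted near the top of this block documents the all_errors flag (new in Python 3.11). By default only the last per-address error is raised, while all_errors=True collects every attempt into an ExceptionGroup. An illustration against the same unreachable port (assumes, as above, that nothing listens on 127.0.0.1:9):

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        # except* matches members of the ExceptionGroup raised when
        # every resolved address fails to connect.
        for exc in group.exceptions:
            print(exc)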
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_fuseki2__v3_8_0__stw.py:835:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
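Editor's note: the CONSTRUCT tests exercise graph formats rather than result sets; the JSON-LD variants send Accept: application/ld+json,application/x-json+ld, matching SPARQLWrapper's JSONLD constant. A sketch of the corresponding call (same placeholder endpoint and invented query as before):

    from SPARQLWrapper import SPARQLWrapper, GET, JSONLD

    sparql = SPARQLWrapper("http://127.0.0.1:9/ds/query")
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)
    # With a live endpoint, query().convert() would yield an rdflib Graph;
    # in this build it raises URLError exactly as logged above.
    result = sparql.query()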
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:843:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
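Every failure in this block reaches the network through the same three
SPARQLWrapper frames (query -> _query -> urlopener). A minimal sketch of roughly
what the suite's __generic() helper boils down to; the endpoint URL and the query
text are placeholders rather than the suite's actual values (the real runs target
a Fuseki 2 instance, with the endpoint apparently remapped to the unreachable
127.0.0.1:9 for this offline build):

    from SPARQLWrapper import GET, JSONLD, SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/stw/query")  # placeholder endpoint
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)  # selects the Accept header seen above
    result = sparql.query()         # raises urllib.error.URLError when nothing listens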
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:905:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:914:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:810:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
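The "During handling of the above exception, another exception occurred:" banner
in every failure is Python's implicit exception chaining: URLError is raised
inside an "except OSError" handler, so the interpreter attaches the pending
ConnectionRefusedError as __context__ and pytest prints both tracebacks. A
self-contained sketch of the mechanism, with WrappedError as a hypothetical
stand-in for URLError:

    class WrappedError(Exception):
        pass

    def fetch():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            raise WrappedError(err)  # implicitly sets __context__ = err

    try:
        fetch()
    except WrappedError as exc:
        # Both the wrapper and the original are available, just as in the log.
        assert isinstance(exc.__context__, ConnectionRefusedError)
        print("chained from:", repr(exc.__context__))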
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:742:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:776:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:939:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
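Every failure in this section has the same root cause: pbuilder builds with
network access disabled, so each TCP connect to the test endpoint is refused.
For comparison only, a hedged sketch of one common way live-endpoint suites
guard against an unreachable server; this is not what
test_fuseki2__v3_8_0__stw.py does, and the helper, class name, and probe
address below are hypothetical:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        """Best-effort TCP probe; False means the live tests should be skipped."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                         "SPARQL endpoint unreachable (offline build?)")
    class SPARQLWrapperLiveTests(unittest.TestCase):
        ...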
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:947:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) test/test_fuseki2__v3_8_0__stw.py:704: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:711:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
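The create_connection() docstring quoted in every traceback describes the stdlib helper that http.client ultimately calls. A short sketch of it in isolation, under the same assumption that port 9 has no listener:

    import socket

    # create_connection() resolves the address with getaddrinfo(), tries
    # each result in turn, and re-raises the last error when every attempt
    # fails (all_errors=True would raise an ExceptionGroup instead).
    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    except OSError as err:
        print(err)  # [Errno 111] Connection refused
    else:
        sock.close()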
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:897:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
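The _Conneg variants assert that the wrapper requests a format purely through the Accept header, which is where values such as '*/*' and 'application/rdf+xml' in the captured headers come from. A hedged sketch of that configuration, assuming the public setOnlyConneg() setter and a placeholder endpoint and query:

    from SPARQLWrapper import SPARQLWrapper, RDFXML

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # placeholder endpoint
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(RDFXML)  # selects the Accept header to send
    sparql.setOnlyConneg(True)      # negotiate via Accept only, no extra URL parameter
    result = sparql.query()         # raises URLError here while the port is closed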
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_8_0__stw.py:851:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
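The POST variants differ from the GET ones only in transport: SPARQLWrapper url-encodes the query into the request body, which is where the 'Content-Type': 'application/x-www-form-urlencoded' and Content-Length headers in these failures come from. A sketch of that setup, again with a placeholder endpoint and query:

    from SPARQLWrapper import JSONLD, POST, SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # placeholder endpoint
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(POST)      # the query moves into the url-encoded body
    sparql.setReturnFormat(JSONLD)
    result = sparql.query()     # same URLError while nothing listens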
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:859:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:922:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
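The comments inside do_open() explain why every captured header set contains 'Connection': 'close': the addinfourl response object cannot manage a persistent connection, so the handler forces the server to end the exchange after one response. The equivalent request made directly through http.client, against the same unreachable host (the /sparql path is again illustrative):

    import http.client

    conn = http.client.HTTPConnection("127.0.0.1", 9, timeout=5)
    try:
        # The same header do_open() injects before sending the request.
        conn.request("GET", "/sparql", headers={"Connection": "close"})
        response = conn.getresponse()
    except OSError as err:
        print(err)  # connect() fails first: [Errno 111] Connection refused
    finally:
        conn.close()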
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:931:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:827:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:759:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_fuseki2__v3_8_0__stw.py:955: 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[request/connect traceback identical to testConstructByPOSTinRDFXML_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:963: 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[request/connect traceback identical to testConstructByPOSTinRDFXML_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:718: 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[request/connect traceback identical to testConstructByPOSTinRDFXML_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
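As the do_open() source quoted in each traceback shows, urllib catches the OSError from the failed connect and re-raises it as urllib.error.URLError, which is why every test reports both exceptions chained together. A short sketch of the same wrapping, again assuming the same unreachable endpoint:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original socket error (errno 111)
        print(err.reason)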
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:725: 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[request/connect traceback identical to testConstructByPOSTinRDFXML_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1147: 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[request/connect traceback identical to the ones above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_fuseki2__v3_8_0__stw.py:1107: 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[request/connect traceback identical to the ones above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
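For reference, the query path exercised by these tests (Wrapper.py's query() calling _query(), which hands the request to urlopen) corresponds to ordinary SPARQLWrapper usage along these lines; the endpoint URL and query text below are placeholders, not the suite's actual fixtures:

    from SPARQLWrapper import SPARQLWrapper, JSONLD

    # Hypothetical endpoint; the tests target a local Fuseki instance
    # that is unreachable in this build environment.
    sparql = SPARQLWrapper("http://127.0.0.1:9/ds/query")
    sparql.setQuery("DESCRIBE <http://example.org/resource>")
    sparql.setReturnFormat(JSONLD)
    result = sparql.query()  # raises URLError while the endpoint is down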
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1114: 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[request/connect traceback identical to the ones above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:1174: 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[request/connect traceback identical to the ones above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: 
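The create_connection() docstring quoted in every traceback also mentions the *all_errors* flag: with all_errors=True (Python 3.11+), the function raises an ExceptionGroup collecting every per-address failure instead of only the last one. A sketch, once more assuming nothing listens on port 9:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # except* unpacks the ExceptionGroup raised when all_errors=True
        for exc in group.exceptions:
            print(exc)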
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) test/test_fuseki2__v3_8_0__stw.py:1174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1183: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1083: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1015: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1049: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:1208: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
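Note that testDescribeByGETinUnknow passes the unsupported return format "foo", yet the request above still advertises Accept: application/rdf+xml; the library keeps its default DESCRIBE format when handed an unknown one, so the failure mode is unchanged. A sketch of the client-side calls that produce requests like the ones in these tracebacks (the endpoint URL and query IRI are illustrative, not taken from the suite):

    from SPARQLWrapper import SPARQLWrapper, GET, JSON

    sparql = SPARQLWrapper("http://127.0.0.1:9/stw/query")  # illustrative endpoint
    sparql.setQuery("DESCRIBE <http://zbw.eu/stw/>")        # illustrative IRI
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)  # an unknown value such as "foo" is ignored,
                                  # leaving the default format in place
    result = sparql.query()       # raises urllib.error.URLError on this host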
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1216: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:977: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... stdlib traceback identical to testDescribeByGETinJSON_Unexpected above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:984: [...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
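Nothing under test is exercised past the refused connect(), so a host without a reachable endpoint will always see this whole block fail. One common guard for suites like this, sketched below with pytest (the helper name and module-level marker are illustrative and not part of this package's test suite), is to probe the endpoint once and skip when it is unreachable:

    import socket
    import pytest

    def endpoint_reachable(host="127.0.0.1", port=9, timeout=1.0):
        """Probe the test endpoint with a plain TCP connect."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Skip every test in the module when the endpoint cannot be reached,
    # e.g. inside an isolated build environment.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable(), reason="SPARQL endpoint unreachable"
    )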
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1166:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
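The create_connection listing quoted above is the stdlib helper where the refusal actually happens: it iterates over getaddrinfo() results and attempts connect() on each. A small usage sketch of the documented parameters (timeout and source_address), reusing the same unreachable address from the log:

    import socket

    try:
        # Same address as the failing tests; port 9 normally refuses TCP.
        sock = socket.create_connection(
            ("127.0.0.1", 9),
            timeout=5.0,              # per-socket timeout set before connect()
            source_address=("", 0),   # '' or port 0 => let the OS choose
        )
    except ConnectionRefusedError as err:
        print(err)                    # [Errno 111] Connection refused
    else:
        sock.close()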
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)

test/test_fuseki2__v3_8_0__stw.py:1121:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
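The "During handling of the above exception, another exception occurred" banner between the two tracebacks is Python's implicit exception chaining: do_open re-raises the OSError as a URLError inside an except block, so the original ConnectionRefusedError is kept as __context__ (and as URLError.reason). A standalone illustration of that mechanism:

    from urllib.error import URLError

    def flaky():
        raise ConnectionRefusedError(111, "Connection refused")

    try:
        try:
            flaky()
        except OSError as err:      # mirrors do_open's except clause
            raise URLError(err)     # implicit chaining: err becomes __context__
    except URLError as wrapped:
        assert isinstance(wrapped.reason, ConnectionRefusedError)
        assert wrapped.__context__ is wrapped.reason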
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1128:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
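None of these tests are failing for an endpoint-specific reason; they fail only because the sandbox has no network. One common way to make such a suite degrade gracefully is to probe the endpoint once and skip when it is unreachable. A sketch of that pattern (not how this package's suite is actually written), assuming pytest and the address from the log:

    import socket

    import pytest

    ENDPOINT_HOST, ENDPOINT_PORT = "127.0.0.1", 9   # values from the log

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        """Return True if a TCP connection to (host, port) succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_endpoint = pytest.mark.skipif(
        not endpoint_reachable(ENDPOINT_HOST, ENDPOINT_PORT),
        reason="SPARQL test endpoint is not reachable",
    )

    @requires_endpoint
    def test_describe_by_post_in_json():
        ...  # would issue the DESCRIBE query against the live endpoint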
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:1191:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
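In the do_open listing above, req.unredirected_hdrs is merged with req.headers (unredirected entries win), and every name is then Title-Cased, which is why the dumped headers dicts consistently show 'Accept', 'Connection', 'Content-Type' in canonical form. The same normalization can be observed directly on a Request object; a small sketch with a hypothetical URL and placeholder body:

    import urllib.request

    req = urllib.request.Request("http://127.0.0.1:9/", data=b"query=...")
    req.add_header("accept", "application/ld+json,application/x-json+ld")
    req.add_unredirected_header("content-type",
                                "application/x-www-form-urlencoded")

    # Request capitalizes header names on storage; do_open later re-applies
    # str.title() when it builds the final dict it actually sends.
    print(req.header_items())
    # e.g. [('Content-type', 'application/x-www-form-urlencoded'),
    #       ('Accept', 'application/ld+json,application/x-json+ld')]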
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1200:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '283',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1100:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
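The create_connection docstring quoted repeatedly above also documents all_errors (Python 3.11+): instead of re-raising only the last per-address error, the function raises an ExceptionGroup collecting every failed attempt. A minimal sketch against the same refused address, using except* to unpack the group:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2.0, all_errors=True)
    except* ConnectionRefusedError as group:
        # One ConnectionRefusedError per getaddrinfo() result that was tried.
        for exc in group.exceptions:
            print(exc)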
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1032:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib.request / http.client / socket frames as above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1066:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... identical urllib.request frames and do_open listing as above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) test/test_fuseki2__v3_8_0__stw.py:1224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1232:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:991:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:998:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_8_0__stw.py:1257:
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

host = '127.0.0.1:9'

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1242:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
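The testQueryBadFormed failure above differs from the others only in what the test expects: assertRaises looks for QueryBadFormed, which, as an assumption about SPARQLWrapper's error mapping, is produced from an HTTP 400 response. With the connection refused, no response arrives at all, so a transport-level URLError propagates instead and the assertion itself fails. A hedged sketch of that mismatch:

# Sketch under the assumption above: a refused connection surfaces as
# urllib.error.URLError, never as an HTTP-level error that SPARQLWrapper
# could translate into QueryBadFormed.
import unittest
import urllib.error
import urllib.request

class RefusedConnectionDemo(unittest.TestCase):
    def test_transport_error_precedes_http_error(self):
        # The address comes from the log; nothing listens on port 9.
        with self.assertRaises(urllib.error.URLError):
            urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)

if __name__ == "__main__":
    unittest.main()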
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1248:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1245:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1261:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_8_0__stw.py:250:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:306:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
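Only the top three frames of each traceback belong to SPARQLWrapper itself (query() -> _query() -> urlopener(request)); everything below is stdlib. A hedged sketch of the equivalent direct call — the endpoint URL and query string here are made-up placeholders, not the suite's actual fixtures — showing where the URLError surfaces to a caller:

    from SPARQLWrapper import SPARQLWrapper, GET, JSON

    # Placeholder endpoint standing in for the suite's Fuseki/STW
    # service; any unreachable host reproduces the URLError above.
    sparql = SPARQLWrapper("http://127.0.0.1:9/ds/query")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    result = sparql.query()   # urllib.error.URLError raised here
    print(result.convert())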
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:390:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:348:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_fuseki2__v3_8_0__stw.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
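The create_connection() docstring quoted in these tracebacks mentions the *all_errors* switch: False re-raises only the last per-address error, True raises an ExceptionGroup of all of them. A small sketch of that behaviour (requires Python 3.11+ for except*; again assumes nothing listens on local port 9):

    import socket

    # all_errors=True collects every per-address failure into an
    # ExceptionGroup instead of re-raising only the last one.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1,
                                 all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)       # [Errno 111] Connection refused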
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[remaining urllib/http.client/socket frames identical to
 SPARQLWrapperTests.testQueryWithComma_1 above; request headers were
 {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:285: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinUnknow>

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:419:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
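testSelectByGETinUnknow passes the unsupported format "foo", and the Accept header above shows the wrapper fell back to its XML default. A sketch of that fallback, assuming setReturnFormat() merely warns and keeps the default for formats it does not recognize (the endpoint URL is a placeholder):

import warnings
from SPARQLWrapper import SPARQLWrapper

sparql = SPARQLWrapper("http://zbw.eu/beta/sparql/stw/query")  # assumed endpoint URL
with warnings.catch_warnings(record=True):
    warnings.simplefilter("always")
    sparql.setReturnFormat("foo")   # unsupported; expected to warn
print(sparql.returnFormat)          # 'xml' -- the default survives (assumed behavior)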
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:428:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
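The *_Conneg test variants (onlyConneg=True) exercise pure HTTP content negotiation. A sketch, assuming this maps to SPARQLWrapper's setOnlyConneg(), which drops the extra format query parameters and leaves only the Accept header (endpoint and query are placeholders):

from SPARQLWrapper import SPARQLWrapper, TSV, GET

sparql = SPARQLWrapper("http://zbw.eu/beta/sparql/stw/query")  # assumed endpoint URL
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")        # placeholder query
sparql.setMethod(GET)
sparql.setReturnFormat(TSV)     # -> Accept: text/tab-separated-values
sparql.setOnlyConneg(True)      # rely on the Accept header alone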
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinXML>

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:218:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
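For reference, the GET/XML path that testSelectByGETinXML drives, per the Wrapper.py frames above (endpoint URL and query text are placeholders):

from SPARQLWrapper import SPARQLWrapper, XML, GET

sparql = SPARQLWrapper("http://zbw.eu/beta/sparql/stw/query")  # assumed endpoint URL
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")        # placeholder query
sparql.setMethod(GET)
sparql.setReturnFormat(XML)
result = sparql.query()   # Wrapper.py:960 -> _query() -> urlopen(); URLError offline
doc = result.convert()    # xml.dom.minidom document for XML result sets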
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:226:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
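The create_connection() docstring quoted in each traceback mentions all_errors: with all_errors=True (Python 3.11+) every per-address failure surfaces in an ExceptionGroup rather than only the last one. A quick sketch against the same unreachable address:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
except* ConnectionRefusedError as eg:
    # eg is the matching slice of the ExceptionGroup raised above
    for exc in eg.exceptions:
        print(exc)   # [Errno 111] Connection refused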
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV>

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_fuseki2__v3_8_0__stw.py:264:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
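The POST failures carry Content-Type: application/x-www-form-urlencoded plus a computed Content-Length. Roughly the same request built with plain urllib (the path and parameter name are assumptions, not read from Wrapper.py):

import urllib.parse
import urllib.request

query = "SELECT ?s WHERE { ?s ?p ?o } LIMIT 1"   # placeholder query
data = urllib.parse.urlencode({"query": query}).encode("ascii")
req = urllib.request.Request(
    "http://127.0.0.1:9/sparql",                 # unreachable in this build
    data=data,                                   # presence of data => POST
    headers={"Accept": "text/csv"},
)
print(req.get_method(), len(data))               # POST + the Content-Length urllib will send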
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg>

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:271:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
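The 34-byte Content-Length gap between testSelectByPOSTinCSV (437) and its _Conneg variant (403) is consistent with three format-hint parameters being dropped in conneg-only mode; the exact parameter names below are an assumption, not read from Wrapper.py:

import urllib.parse

hints = [("format", "csv"), ("output", "csv"), ("results", "csv")]  # assumed names
extra = "&" + urllib.parse.urlencode(hints)
print(extra)        # &format=csv&output=csv&results=csv
print(len(extra))   # 34 -- matches 437 - 403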
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinJSON>

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:320:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
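Collecting the Accept headers observed across these failures gives the format-to-conneg mapping the suite exercises (values transcribed from the tracebacks above, not from SPARQLWrapper's source):

ACCEPT_BY_FORMAT = {
    "xml":  "application/sparql-results+xml",
    "json": "application/sparql-results+json,application/json,"
            "text/javascript,application/javascript",
    "csv":  "text/csv",
    "tsv":  "text/tab-separated-values",
}
for fmt, accept in sorted(ACCEPT_BY_FORMAT.items()):
    print(f"{fmt:4s} -> Accept: {accept}")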
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open()/create_connection() source and traceback identical to testSelectByGETinTSV_Conneg above; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

    [urllib.request call chain and do_open() listing identical to the first failure above; elided]

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
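testSelectByPOSTinJSONLD_Unexpected_Conneg asks for JSON-LD from a SELECT query; since JSON-LD is a graph serialization (CONSTRUCT/DESCRIBE results), the headers above show the wrapper sending Accept: */* instead. A sketch of that mismatched request (endpoint and query are placeholders):

from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

sparql = SPARQLWrapper("http://zbw.eu/beta/sparql/stw/query")  # assumed endpoint URL
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")        # SELECT, not CONSTRUCT
sparql.setMethod(POST)
sparql.setReturnFormat(JSONLD)   # unexpected for SELECT -> Accept: */*
sparql.setOnlyConneg(True)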
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:327:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:369:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)

test/test_fuseki2__v3_8_0__stw.py:292:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
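Every one of these tests funnels through the same helper (__generic at test/test_fuseki2__v3_8_0__stw.py:194), which builds a SPARQLWrapper object and calls query(). A sketch of that call path using the public SPARQLWrapper 2.0.0 API; the endpoint URL and query text are illustrative placeholders, not the suite's actual fixtures:

# Sketch of the call chain the failing tests exercise (hypothetical endpoint
# and query, for illustration only).
from SPARQLWrapper import SPARQLWrapper, JSON, POST

sparql = SPARQLWrapper("http://127.0.0.1:9/ds/sparql")  # unreachable in this build
sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(POST)        # POST tests send a form-encoded body (Content-Type above)
sparql.setReturnFormat(JSON)  # chooses the Accept header recorded in the log
result = sparql.query()       # Wrapper.py:960 -> _query() -> urlopen(request)
print(result.convert())

With the endpoint unreachable, query() is the call that raises the URLError recorded in each traceback.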
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:299:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_fuseki2__v3_8_0__stw.py:437:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:446:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
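All of these failures are environmental: pbuilder disables network access, so any test that insists on a live SPARQL endpoint can only error out. One common way to make such a suite degrade gracefully offline (an illustration only, not something this package currently does) is to probe the endpoint once and skip:

# Sketch (hypothetical guard, not part of this test suite): skip
# network-dependent tests when the endpoint cannot be reached.
import socket
import unittest

def endpoint_reachable(host="127.0.0.1", port=9, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

@unittest.skipUnless(endpoint_reachable(), "SPARQL endpoint not reachable")
class OnlineSPARQLTests(unittest.TestCase):
    def test_select(self):
        ...  # would exercise sparql.query() against the live endpoint

Under such a guard an offline build would report skips instead of the errors above.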
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:234:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:242:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:663:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
[urllib.request -> http.client -> socket frames identical to the first failure above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
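The create_connection() docstring quoted throughout these tracebacks mentions an *all_errors* flag: by default only the last per-address error is re-raised (what happens here), while all_errors=True collects every failed attempt into an ExceptionGroup. A sketch of that behaviour on Python 3.11+, reusing the same unreachable discard port the tests hit:

# Sketch: observe create_connection's all_errors behaviour (Python 3.11+).
import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
except* ConnectionRefusedError as group:
    # with all_errors=True every failed connection attempt is preserved
    for exc in group.exceptions:
        print("attempt failed:", exc)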
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:585:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
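do_open() (quoted in full above) converts any OSError raised while sending the request into urllib.error.URLError at request.py:1347, which is why each test ultimately reports URLError rather than the underlying ConnectionRefusedError. A sketch of the same wrapping, against the same unreachable address:

from urllib.error import URLError
from urllib.request import urlopen

try:
    urlopen("http://127.0.0.1:9/", timeout=5)
except URLError as err:
    # The original socket error survives as the .reason attribute.
    print(type(err.reason).__name__)  # ConnectionRefusedError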
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:621:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
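Each test funnels through the suite's __generic() helper (test file, line 203), which builds a SPARQLWrapper, fires the query, and inspects the result. Roughly, and hedged: the endpoint URL below is illustrative rather than the suite's real GraphDB URL, and setOnlyConneg is the switch behind the *_Conneg test names:

from SPARQLWrapper import SPARQLWrapper, GET, JSON

sparql = SPARQLWrapper("http://factforge.net/repositories/example")  # illustrative URL
sparql.setQuery("ASK { ?s ?p ?o }")  # stands in for the suite's askQuery
sparql.setMethod(GET)
sparql.setReturnFormat(JSON)
sparql.setOnlyConneg(True)   # negotiate the result format via the Accept header only
result = sparql.query()      # raises urllib.error.URLError when the endpoint is unreachable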
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:702:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:488:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
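One detail of do_open() worth noting: it title-cases header names before sending, which is why the dumped headers always read 'Accept', 'User-Agent' and so on, regardless of how the caller spelled them. The transformation is exactly the dict comprehension quoted above:

headers = {"user-agent": "sparqlwrapper 2.0.0",
           "ACCEPT": "application/sparql-results+xml"}
# str.title() capitalizes each hyphen-separated word and lowercases the rest.
print({name.title(): val for name, val in headers.items()})
# {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': 'application/sparql-results+xml'}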
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:684:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '437',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:600:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
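The POST variants differ from the GET ones only in how the query travels: the wrapper form-encodes it into the request body, which is where the Content-Type: application/x-www-form-urlencoded and Content-Length: 437 headers above come from. A sketch of that encoding (the URL and query text are illustrative):

from urllib.parse import urlencode
from urllib.request import Request

body = urlencode({"query": "ASK { ?s ?p ?o }"}).encode("ascii")
req = Request("http://127.0.0.1:9/sparql", data=body, method="POST")
# urllib derives the Content-Length header from len(body) when sending.
print(len(body))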
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:642:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
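create_connection() (socket.py, quoted in full above) tries every address that getaddrinfo() returns and, when all attempts fail, re-raises the first collected error — the `raise exceptions[0]` frame at socket.py:865 in each traceback. Stripped to its core, ignoring source_address handling, timeouts, and socket cleanup:

import socket

def connect_first(host, port):
    # Simplified sketch of the stdlib loop, not a drop-in replacement.
    errors = []
    for af, kind, proto, _cname, sa in socket.getaddrinfo(host, port, 0,
                                                          socket.SOCK_STREAM):
        try:
            sock = socket.socket(af, kind, proto)
            sock.connect(sa)
            return sock
        except OSError as err:
            errors.append(err)
    if errors:
        raise errors[0]  # mirrors `raise exceptions[0]` in the traceback
    raise OSError("getaddrinfo returned no addresses")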
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '437',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:721:
[do_open -> http.client -> socket.create_connection traceback identical to
 testAskByGETinJSONLD_Unexpected_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   urllib.error.URLError:
/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self =
http_class = <class 'http.client.HTTPConnection'>
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg>

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:505:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class = <class 'http.client.HTTPConnection'>
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
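[Editor's note] The same two layers can be reproduced with the standard library alone. This is a sketch of the mechanism the tracebacks keep printing, assuming (as in this chroot) that nothing listens on local port 9; it is not part of the package or its tests:

    # Sketch of the two exception layers in each traceback:
    # socket.create_connection raises ConnectionRefusedError (ECONNREFUSED),
    # and urllib.request.urlopen re-raises that OSError as a URLError.
    import socket
    import urllib.error
    import urllib.request

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as err:
        print("socket layer:", err)         # [Errno 111] Connection refused

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        print("urllib layer:", err.reason)  # the same ECONNREFUSED, wrapped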
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class = <class 'http.client.HTTPConnection'>
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:898:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class = <class 'http.client.HTTPConnection'>
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:864: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:936: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:834: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:774: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:804: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:972: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:744:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:917:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
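Every failure in this run reduces to the same root cause: with network access disabled, the suite's endpoint resolves to 127.0.0.1:9 (the discard port), the TCP connect is refused, and urllib wraps the resulting OSError in a URLError. A minimal sketch reproducing that wrapping, assuming nothing is listening on local port 9 (the URL path is illustrative):

import urllib.error
import urllib.request

try:
    # Port 9 is the TCP "discard" port; with no listener (as in this
    # network-less chroot) the kernel refuses the connection outright.
    urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
except urllib.error.URLError as err:
    # urllib re-raises the OSError as URLError (urllib/request.py:1347);
    # err.reason is the original ConnectionRefusedError.
    print(type(err.reason).__name__, "-", err.reason)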
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:879:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
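For context, the self.__generic(...) calls in these failures boil down to a SPARQLWrapper invocation along the following lines. This is a sketch of the public API, not the test helper itself, and the endpoint URL is a placeholder (only the Host header, factforge.net, is visible in this log):

from SPARQLWrapper import SPARQLWrapper, GET, XML

sparql = SPARQLWrapper("http://factforge.net/sparql")  # assumed endpoint URL
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(GET)        # or POST, as in the POST variants above
sparql.setReturnFormat(XML)  # maps to Accept: application/rdf+xml
sparql.setOnlyConneg(True)   # negotiate via the Accept header only
result = sparql.query()      # raises URLError when the endpoint is unreachable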
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:955:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:849:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:789:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:819:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:989:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
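The Accept headers vary per test because SPARQLWrapper maps each requested return format to media types for CONSTRUCT/DESCRIBE queries, and an unrecognised format (the "bar" case above) falls back to a default. An illustrative mapping inferred from the headers in this log, not SPARQLWrapper's actual code or names:

# Illustrative only -- not SPARQLWrapper's internals.
ACCEPT_FOR_GRAPH = {
    "xml":     "application/rdf+xml",
    "turtle":  "application/turtle,text/turtle",
    "n3":      "application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3",
    "json-ld": "application/ld+json,application/x-json+ld",
}

def accept_header(fmt: str) -> str:
    # An unknown format such as "bar" falls back to the default, which is
    # why the Unknow test above still sent Accept: application/rdf+xml.
    return ACCEPT_FOR_GRAPH.get(fmt, "application/rdf+xml")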
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:759:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source listing identical to the occurrence above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    [socket.create_connection() source listing identical to the occurrence above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above:
     __generic -> Wrapper.query -> Wrapper._query -> urlopen -> open
     -> _open -> _call_chain -> http_open -> do_open]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source listing identical to the occurrence above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
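As the do_open() listing above shows, urllib.request never lets the raw OSError escape: the failed h.request() is caught and re-raised as URLError, which is why each test reports two chained exceptions. A sketch of what a caller sees, against the same unreachable endpoint:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        # err.reason carries the original socket-level error
        print(type(err.reason).__name__, err.reason)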
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1132:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
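The __generic() helper at test/test_graphdbEnterprise__v8_9_0__rs.py:203 is not shown in this log; judging from the Wrapper.py frames above, it presumably drives SPARQLWrapper roughly as follows (the endpoint URL and query below are hypothetical; setOnlyConneg corresponds to the tests' onlyConneg=True argument):

    from SPARQLWrapper import GET, JSONLD, SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/repositories/test")  # hypothetical
    sparql.setQuery("DESCRIBE <http://example.org/resource>")       # hypothetical
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)
    sparql.setOnlyConneg(True)  # negotiate the result format via Accept only
    result = sparql.query()     # raises URLError while the endpoint is down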
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1204:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
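The do_open() source repeated through these tracebacks also explains the 'Connection: close' entry visible in every locals dump: urllib's response wrapper cannot manage persistent connections, so the handler forces the connection closed after the single request. A minimal http.client replay of the same sequence (host and path are placeholders):

    import http.client

    # do_open() passes the 'host:port' string straight to HTTPConnection,
    # which parses it, then sends the request with Connection: close.
    conn = http.client.HTTPConnection("127.0.0.1:9", timeout=1)
    try:
        conn.request("GET", "/", headers={"Connection": "close",
                                          "Accept": "application/rdf+xml"})
        response = conn.getresponse()
    except OSError as err:
        print("request failed:", err)  # refused in this build, as in the log
    finally:
        conn.close()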
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1042:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1072:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
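Failures of this shape are expected wherever the build runs offline. One way a suite can cope, sketched here with hypothetical names rather than anything this test suite actually does, is to probe the endpoint once and skip cleanly when it is unreachable:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("factforge.net", 80),
                         "SPARQL endpoint unreachable (offline build?)")
    class DescribeQueryTests(unittest.TestCase):
        def test_describe_in_turtle(self):
            ...  # would issue the DESCRIBE query shown above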
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open(), http.client and socket.create_connection() frames identical
     to the failures above]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1012:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [intermediate frames identical to the first failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
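Taken together, the locals dumps above record how SPARQLWrapper maps each requested return format onto an Accept header for these DESCRIBE/CONSTRUCT queries. The dict below only summarizes the values observed in this log; it is not SPARQLWrapper's internal table:

    ACCEPT_BY_FORMAT = {
        "XML / RDFXML / unknown 'foo'": "application/rdf+xml",
        "JSONLD": "application/ld+json,application/x-json+ld",
        "N3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "TURTLE": "application/turtle,text/turtle",
        "CSV / JSON (unexpected for DESCRIBE)": "*/*",
    }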
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
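The paired tracebacks joined by "During handling of the above exception, another exception occurred:" are ordinary Python exception chaining: do_open() raises URLError inside the except block that caught the socket error, so the interpreter links the two automatically. A small demonstration of the mechanism, independent of SPARQLWrapper:

import urllib.error

try:
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:  # mirrors the except clause in do_open()
        raise urllib.error.URLError(err)
except urllib.error.URLError as exc:
    # The socket error travels along as the implicit context, which pytest
    # renders as the paired tracebacks seen in each entry here.
    assert isinstance(exc.__context__, ConnectionRefusedError)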
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1223:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '424',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1057:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1087:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1027:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
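Taken together, the eight DESCRIBE failures above differ only in the negotiated Accept header (and the test line number); the endpoint, method, and 424-byte form-encoded body are identical. Reading the headers straight off this log gives the mapping below; the dict is illustrative, not SPARQLWrapper's internal table:

# Accept header observed for each requested DESCRIBE return format.
ACCEPT_BY_FORMAT = {
    "XML": "application/rdf+xml",
    "RDFXML": "application/rdf+xml",
    "JSONLD": "application/ld+json,application/x-json+ld",
    "N3": ("application/turtle,text/turtle,text/rdf+n3,"
           "application/n-triples,application/n3,text/n3"),
    "TURTLE": "application/turtle,text/turtle",
    # An unknown format ("bar") falls back to RDF/XML:
    "bar": "application/rdf+xml",
    # Formats a DESCRIBE cannot produce (CSV, JSON) fall back to a wildcard:
    "CSV": "*/*",
    "JSON": "*/*",
}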
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_graphdbEnterprise__v8_9_0__rs.py:1286:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
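Note that the testKeepAlive request still carries 'Connection': 'close' despite the setUseKeepAlive() call: as the do_open() listing above shows, urllib's default HTTP handler overwrites that header unconditionally, and setUseKeepAlive() only changes behavior when the optional keepalive package is importable (otherwise SPARQLWrapper warns and does nothing). A sketch of the test's setup; only the factforge.net host is attested by the log, the endpoint path is a hypothetical stand-in:

from SPARQLWrapper import SPARQLWrapper, JSON, GET

sparql = SPARQLWrapper("http://factforge.net/sparql")  # hypothetical path
sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
sparql.setReturnFormat(JSON)
sparql.setMethod(GET)
sparql.setUseKeepAlive()  # no effect unless the keepalive module is installed
result = sparql.query()   # raises URLError while the endpoint is unreachable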
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request and http.client frames identical to the first failure above ...]

E           urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) test/test_graphdbEnterprise__v8_9_0__rs.py:1271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) test/test_graphdbEnterprise__v8_9_0__rs.py:1290: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) test/test_graphdbEnterprise__v8_9_0__rs.py:1298: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:269: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:329: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:365: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:299: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:445:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
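The two chained tracebacks per test are urllib's doing: do_open catches the OSError from the socket layer and re-raises it as URLError, keeping the original exception as err.reason. A sketch of the same wrapping, again assuming a closed local port:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)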
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:236:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
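The frames above (test -> __generic -> sparql.query() -> urllib) suggest the call pattern below; the test helper __generic is not shown in this log, so the endpoint URL and query here are placeholders, not the suite's actual values:

    from SPARQLWrapper import GET, XML, SPARQLWrapper

    sparql = SPARQLWrapper("http://factforge.net/sparql")  # assumed endpoint path
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    sparql.setOnlyConneg(True)   # request the format via the Accept header only
    result = sparql.query()      # raises URLError when the endpoint is unreachable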
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:284:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
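The POST variants above carry Content-Type: application/x-www-form-urlencoded and a Content-Length of 677: the query travels form-encoded in the request body. A rough equivalent in plain urllib (URL and query are stand-ins, not the suite's values):

    import urllib.parse
    import urllib.request

    data = urllib.parse.urlencode({"query": "SELECT * WHERE { ?s ?p ?o }"}).encode("ascii")
    req = urllib.request.Request("http://127.0.0.1:9/sparql", data=data)  # hypothetical URL
    req.add_header("Accept", "text/csv")
    # supplying data switches the method to POST; urllib's handler adds
    # Content-Type and Content-Length when the request is actually sent
    print(req.get_method())  # POST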
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:428:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
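The Accept headers vary with the requested return format, and the *_Unexpected_Conneg tests (JSONLD or N3 for a SELECT query) end up with '*/*'. A sketch of the mapping as observed in this log; the format key strings are assumptions based on SPARQLWrapper's constants:

    SELECT_ACCEPT = {
        "xml":  "application/sparql-results+xml",
        "csv":  "text/csv",
        "tsv":  "text/tab-separated-values",
        "json": ("application/sparql-results+json,application/json,"
                 "text/javascript,application/javascript"),
    }

    def accept_for(fmt):
        if fmt in SELECT_ACCEPT:
            return SELECT_ACCEPT[fmt]
        if fmt in ("json-ld", "n3"):     # graph formats: unexpected for a SELECT
            return "*/*"
        return SELECT_ACCEPT["xml"]      # bogus names fall back to XML, cf. the *Unknow* tests

    assert accept_for("json-ld") == "*/*"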
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:344:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
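do_open's header preparation, listed in full earlier, can be replayed in isolation: unredirected headers win, Connection is forced to close, and names are Title-Cased, which is why every logged headers dict has the same shape. The header values here are illustrative only:

    unredirected = {"Host": "factforge.net", "User-agent": "sparqlwrapper 2.0.0"}
    extra = {"accept": "application/sparql-results+json"}

    headers = dict(unredirected)
    headers.update({k: v for k, v in extra.items() if k not in headers})
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # keys come out as 'Host', 'User-Agent', 'Accept', 'Connection'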
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:386:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
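The proxy branch in do_open never triggers in these tests (req._tunnel_host is unset), but the logic it implements is the standard CONNECT-tunnel dance: Proxy-Authorization goes on the tunnel, never to the origin server. A no-network sketch with a hypothetical proxy:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128)  # hypothetical proxy
    conn.set_tunnel("factforge.net", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    # a later conn.request(...) would CONNECT through the proxy first;
    # constructing the connection opens no socket, so this sketch is inert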
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:314:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
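create_connection, listed in full earlier, collects one exception per resolved address and normally re-raises only the first; with all_errors=True (Python 3.11+) it raises the whole set as an ExceptionGroup instead:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)  # one entry per address attempted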
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... urllib.request.do_open, http.client and socket.create_connection frames
identical to testSelectByGETinTSV_Conneg above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:464:
[... __generic / sparql.query() / urllib call chain identical to above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
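The *Unknow* tests pass bogus format names ("foo", "bar"), yet their Accept header above is the XML one, so SPARQLWrapper apparently falls back to its default format for unrecognised names, presumably with a warning. A hedged probe (endpoint URL assumed; no network is needed to set the format):

    import warnings
    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("http://factforge.net/sparql")  # assumed endpoint
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        sparql.setReturnFormat("bar")        # not a recognised format name
    print([str(w.message) for w in caught])  # expected: a fallback warning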
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:536:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
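The create_connection() docstring quoted in these tracebacks describes the mechanics: the function loops over every address getaddrinfo() returns, collects the per-address errors, and once the loop is exhausted re-raises the last error (the `raise exceptions[0]` frame visible in the log; loopback yields a single candidate) or, with all_errors=True on Python 3.11+, an ExceptionGroup of all of them. A small sketch, assuming nothing is listening on local port 9:

    import socket

    try:
        s = socket.create_connection(("127.0.0.1", 9), timeout=1)
        s.close()  # only reached if something is unexpectedly listening
    except ConnectionRefusedError as exc:
        print(exc)  # [Errno 111] Connection refused

    # With all_errors=True (Python 3.11+) every per-address failure is kept:
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as group:
        for err in group.exceptions:
            print(type(err).__name__, err)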
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:543:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:604:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
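Note how do_open() converts the failure: h.request() raises the OSError, and the handler re-raises it as urllib.error.URLError inside the except block, which is why pytest prints "During handling of the above exception, another exception occurred" and why each failure carries both exceptions. The refused connection stays reachable from the URLError, as this sketch against the same unreachable host:port shows:

    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except URLError as exc:
        print(exc.reason)                      # [Errno 111] Connection refused
        print(type(exc.__context__).__name__)  # ConnectionRefusedError (implicit chain)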
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:687:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:697:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
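All of the test_lov-fuseki_on_hold.py cases are integration tests against a live public endpoint (lov.linkeddata.es), so in an offline build they can only fail. One way such a suite could degrade gracefully in a network-less chroot (not what this package does, judging by this log) is a one-time reachability probe plus unittest's skip machinery; endpoint_reachable below is a hypothetical helper:

    import socket
    import unittest

    def endpoint_reachable(host, port, timeout=2.0):
        """Hypothetical probe: True only if a TCP connection succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("lov.linkeddata.es", 443),
                         "SPARQL endpoint unreachable (offline build?)")
    class SPARQLWrapperTests(unittest.TestCase):
        ...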
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:611:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:641:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
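Because these are HTTPS requests, the chain passes through http.client.HTTPSConnection.connect() (the client.py:1470 frame), which calls the plain HTTPConnection.connect() via super() before any TLS handshake begins; the refusal therefore happens at the raw TCP layer. The same path can be exercised directly, again assuming local port 9 is closed:

    import http.client

    conn = http.client.HTTPConnection("127.0.0.1", 9, timeout=1)
    try:
        # request() connects lazily: send() -> connect() ->
        # socket.create_connection(), exactly the frames in the log.
        conn.request("GET", "/")
    except ConnectionRefusedError as exc:
        print("TCP connect refused before any HTTP was spoken:", exc)
    finally:
        conn.close()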
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:651:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
/usr/lib/python3.12/urllib/request.py:1347: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:651: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_lov-fuseki_on_hold.py:570:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:577:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
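Every failure prints two chained tracebacks because urllib catches the low-level OSError from the failed connect and re-raises it as URLError (the raise at request.py:1347 above). A sketch of the same wrapping through the public API, again assuming 127.0.0.1:9 refuses connections:

    # urlopen surfaces the refused connect as URLError; the original
    # ConnectionRefusedError is preserved on the .reason attribute.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        print(err.reason)  # [Errno 111] Connection refused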
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:731:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:740:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
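On the application side, each test funnels through the same path: __generic builds a SPARQLWrapper object and calls query(), which hands a urllib Request to urlopener (Wrapper.py:926). A hedged reconstruction of that path, with the endpoint URL invented for illustration (the real __generic helper in test_lov-fuseki_on_hold.py is not shown in this log):

    # Sketch of the call path seen in the tracebacks: query() builds the HTTP
    # request and fails with URLError once the endpoint refuses connections.
    from SPARQLWrapper import GET, XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # hypothetical endpoint
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError in this environment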
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:498:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:506:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
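A suite written this way stays red in any offline build. One common guard, shown only as a general pattern and not something this package currently implements, is to probe the endpoint once and skip the network-bound tests when it is unreachable:

    # Probe the real endpoint once; mark network tests to skip when offline.
    import socket

    import pytest

    def endpoint_up(host="lov.linkeddata.es", port=443, timeout=3):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    network = pytest.mark.skipif(not endpoint_up(), reason="endpoint unreachable")

    @network
    def test_ask_by_get_in_xml():
        ...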
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:967:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:976:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    [SPARQLWrapper and urllib frames identical to the first failure above]

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
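A small detail visible in the do_open listing reconstructed above: before sending, urllib forces Connection: close and title-cases every header name, which is why the dumped headers always read 'User-Agent' and 'Accept' regardless of how SPARQLWrapper set them. The same normalization in isolation:

    # The normalization do_open applies to the outgoing request headers.
    headers = {"user-agent": "sparqlwrapper 2.0.0", "ACCEPT": "*/*"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': '*/*', 'Connection': 'close'}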
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:928:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
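Every failing test walks the same path: SPARQLWrapper builds a GET request whose Accept header is determined by the requested return format (visible in the headers locals above), and urllib then tries to open the connection. A sketch of that path, assuming the public SPARQLWrapper 2.0.0 API; the CONSTRUCT query text is a made-up placeholder:

    from SPARQLWrapper import GET, JSONLD, SPARQLWrapper

    # The endpoint the test suite targets; inside the chroot its traffic
    # ends up at 127.0.0.1:9, so query() raises URLError as logged above.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)   # yields Accept: application/ld+json,...
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
    result = sparql.query()          # urllib.request.urlopen under the hood
    print(result.response.status)    # 200 when the endpoint is reachable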
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:936:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:1010:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1019:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
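The "During handling of the above exception, another exception occurred:" banner in each block is Python's implicit exception chaining: do_open() raises URLError inside its except OSError handler, so the original ConnectionRefusedError stays attached as __context__ and pytest prints both. A small illustration; the helper name is invented for the example:

    from urllib.error import URLError

    def open_like_do_open():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:       # mirrors do_open's handler
            raise URLError(err)      # original stays attached as __context__

    try:
        open_like_do_open()
    except URLError as exc:
        print(exc.reason)            # the wrapped ConnectionRefusedError
        print(exc.__context__)       # what pytest prints above the banner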
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:890:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:898:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_lov-fuseki_on_hold.py:814:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
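The header juggling shown in the full do_open() listing earlier merges the request's unredirected headers with the caller's, forces Connection: close so the single response can be read to EOF, and title-cases the names. Those three steps can be reproduced in isolation; the URL is only a placeholder:

    import urllib.request

    req = urllib.request.Request(
        "https://lov.linkeddata.es/dataset/lov/sparql",
        headers={"accept": "application/rdf+xml"},
    )
    # Same three steps as the do_open() listing above.
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items() if k not in headers})
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # {'Accept': 'application/rdf+xml', 'Connection': 'close'}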
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:822:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:852:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
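The *_Conneg test variants differ from their partners only in passing onlyConneg=True, i.e. relying on HTTP content negotiation alone instead of also appending a format parameter to the query string. Assuming SPARQLWrapper's setOnlyConneg() setter (present in 2.0.0), the difference amounts to:

    from SPARQLWrapper import GET, TURTLE, SPARQLWrapper

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setMethod(GET)
    sparql.setReturnFormat(TURTLE)
    # Plain variant: Accept header plus a format query parameter.
    # _Conneg variant: Accept header only.
    sparql.setOnlyConneg(True)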
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:860: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
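The doubled traceback separated by "During handling of the above exception, another exception occurred:" is Python's implicit exception chaining: do_open() catches the OSError from the socket layer and re-raises it wrapped in a URLError, so pytest reports both the original ConnectionRefusedError and the wrapper. The same mechanism in miniature:

    import urllib.error

    try:
        try:
            # stands in for sock.connect() failing inside create_connection()
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:  # mirrors urllib/request.py:1347
            raise urllib.error.URLError(err)
    except urllib.error.URLError as wrapped:
        print(type(wrapped.reason).__name__)  # ConnectionRefusedError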
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) test/test_lov-fuseki_on_hold.py:1052: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
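socket.create_connection(), quoted in full several times above, tries every address that getaddrinfo() returns and only raises once all candidates fail; for 127.0.0.1 there is a single candidate, so its ConnectionRefusedError is re-raised as-is. A condensed sketch of that loop, simplified from the source shown in the traceback:

    import socket

    def connect_any(host, port, timeout=None):
        errors = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                if timeout is not None:
                    sock.settimeout(timeout)
                sock.connect(sa)
                return sock  # first address that connects wins
            except OSError as exc:
                errors.append(exc)
                if sock is not None:
                    sock.close()
        if not errors:
            raise OSError("getaddrinfo returned an empty list")
        raise errors[0]  # matches `raise exceptions[0]` in the traceback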
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:1060: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
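Errno 111 is ECONNREFUSED on Linux: the kernel answered the connection attempt with a reset because no process is listening on the target port. TCP port 9 is the historical "discard" service, which is almost never enabled, so it makes a dependable dead end for builds that must not reach the network. The error is reproducible in isolation (assuming nothing is bound to local port 9):

    import socket

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.connect(("127.0.0.1", 9))  # discard port: nothing listening
        except ConnectionRefusedError as exc:
            print(exc.errno)  # 111 (ECONNREFUSED) on Linux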
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) test/test_lov-fuseki_on_hold.py:779: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
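The header block inside do_open() that keeps reappearing above does three things: it title-cases the header names, forces Connection: close because urllib's response object cannot manage a persistent connection, and, when tunnelling through a proxy, moves Proxy-Authorization out of the request headers so the credential reaches the proxy but never the origin server. The same logic in isolation, as a sketch rather than the stdlib code:

    def split_headers(raw, tunneling):
        headers = {name.title(): value for name, value in raw.items()}
        headers["Connection"] = "close"  # one request per connection
        tunnel_headers = {}
        if tunneling and "Proxy-Authorization" in headers:
            # CONNECT-only header: for the proxy, never the origin server
            tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")
        return headers, tunnel_headers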
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:786: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
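The application frames are identical in every failure: test/test_lov-fuseki_on_hold.py:193 calls sparql.query(), which runs SPARQLWrapper/Wrapper.py:960 (query) and :926 (_query) before handing the request to urllib. In outline, each test does something like the following; the endpoint path and query text are illustrative, not copied from the test file:

    from SPARQLWrapper import SPARQLWrapper, TURTLE, GET

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)           # these tests use GET
    sparql.setReturnFormat(TURTLE)  # drives the Accept header seen above
    result = sparql.query()         # Wrapper.py:960 -> _query() -> urlopen()
    print(result.convert())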
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, GET) test/test_lov-fuseki_on_hold.py:1280: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
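Comparing the headers dict across the failures in this section shows how SPARQLWrapper translates the requested return format of a CONSTRUCT or DESCRIBE query into an Accept header. The values below are collected verbatim from this log; the keys are only descriptive labels:

    ACCEPT_SEEN = {
        "TURTLE":           "application/turtle,text/turtle",
        "XML":              "application/rdf+xml",
        "unknown ('foo')":  "application/rdf+xml",  # falls back to RDF/XML
        "JSON-LD":          "application/ld+json,application/x-json+ld",
        "CSV (unexpected)": "*/*",  # not a CONSTRUCT/DESCRIBE format, accept anything
    }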
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:1289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
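The `timeout is not _GLOBAL_DEFAULT_TIMEOUT` comparison quoted above is the standard sentinel-default pattern: a unique module-level object lets create_connection() distinguish "no timeout argument given" (use the interpreter-wide default) from an explicit None (blocking socket). For example:

    _GLOBAL_DEFAULT = object()  # stands in for socket._GLOBAL_DEFAULT_TIMEOUT

    def connect(timeout=_GLOBAL_DEFAULT):
        if timeout is _GLOBAL_DEFAULT:
            return "use getdefaulttimeout()"
        if timeout is None:
            return "blocking mode, no timeout"
        return f"timeout of {timeout}s"

    print(connect(), "|", connect(None), "|", connect(5))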
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) test/test_lov-fuseki_on_hold.py:1244: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
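The *_Conneg test variants differ from their base tests only by passing onlyConneg=True, which appears to correspond to SPARQLWrapper's setOnlyConneg(): rely purely on HTTP content negotiation via the Accept header instead of also appending a format parameter to the query string. A sketch under that assumption (endpoint and query are again illustrative):

    from SPARQLWrapper import SPARQLWrapper, JSONLD, GET

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setOnlyConneg(True)  # Accept header only, no extra URL parameters
    sparql.setQuery("DESCRIBE <http://example.org/resource>")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)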
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
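As the chained traceback shows, urllib.request's do_open() catches the OSError raised by the socket layer and re-raises it as URLError, keeping the original exception in the .reason attribute; the "<urlopen error ...>" text is simply URLError's rendering of that wrapped error. A short sketch of the same wrapping (the URL path is illustrative; any endpoint with no listener behaves identically):

import urllib.error
import urllib.request

try:
    urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
except urllib.error.URLError as exc:
    print(exc.reason)                        # [Errno 111] Connection refused
    print(isinstance(exc.reason, OSError))   # True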
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:1323:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1332:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:1207:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1215:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
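The Accept header changes with the return format each test requests; the headers dumps above give the content-negotiation strings SPARQLWrapper sends for DESCRIBE queries. A sketch of a request built the same way (the helper function and the dictionary keys are hypothetical illustrations, not SPARQLWrapper API; only the Accept strings are copied from this log):

from urllib.parse import urlencode
from urllib.request import Request

# Accept strings copied verbatim from the failing requests above;
# the format keys are illustrative labels, not SPARQLWrapper constants.
DESCRIBE_ACCEPT = {
    "rdfxml": "application/rdf+xml",
    "turtle": "application/turtle,text/turtle",
    "n3": "application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3",
    "jsonld": "application/ld+json,application/x-json+ld",
    "json": "*/*",   # JSON is unexpected for DESCRIBE, so anything is accepted
}

def build_describe_request(endpoint: str, query: str, fmt: str) -> Request:
    # Hypothetical helper: a GET request carrying the negotiated Accept header.
    return Request(endpoint + "?" + urlencode({"query": query}),
                   headers={"Accept": DESCRIBE_ACCEPT[fmt]})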
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_lov-fuseki_on_hold.py:1131:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1139:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:1169:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1177:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (connection to 127.0.0.1:9 refused; traceback identical to
    testDescribeByGETinJSONLD_Conneg above)
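All of these failures share one cause: the build environment deliberately has no network, so every test that opens a TCP connection dies with ECONNREFUSED. One conventional way to keep endpoint-bound tests from failing in offline builds is to probe reachability once and skip; a sketch using pytest (the helper and marker names are hypothetical, not part of this test suite):

import socket
import pytest

def _reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    # One cheap TCP probe; failure means "skip the test", not "error".
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

needs_network = pytest.mark.skipif(
    not _reachable("lov.linkeddata.es", 443),
    reason="SPARQL endpoint unreachable (network disabled during build)",
)

@needs_network
def test_describe_by_get_in_turtle():
    ...  # the real DESCRIBE query would run here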
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:1365:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1373:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:1096:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
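The DESCRIBE-by-GET tests above differ only in the negotiated Accept header (Turtle vs. RDF/XML, with and without onlyConneg). A sketch of what they would exercise against a reachable endpoint; the endpoint URL and the query are stand-ins, since describeQuery is defined in the test module and not shown in this log:

    # DESCRIBE sent by GET; on success SPARQLWrapper parses the RDF/XML
    # response into an rdflib Graph.
    from SPARQLWrapper import SPARQLWrapper, XML, GET

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")  # assumed URL
    sparql.setQuery("DESCRIBE <http://purl.org/dc/terms/title>")            # stand-in query
    sparql.setReturnFormat(XML)
    sparql.setMethod(GET)
    graph = sparql.query().convert()  # rdflib.Graph for DESCRIBE results
    print(len(graph), "triples")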
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1103:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_lov-fuseki_on_hold.py:1423:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
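testKeepAlive is the one failure above whose body shows the full SPARQLWrapper call sequence rather than the __generic helper. The same sequence as a standalone sketch with the connection failure handled; the endpoint URL is an assumption, and setUseKeepAlive() only takes effect when the optional keepalive package is installed (otherwise it warns and does nothing):

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    endpoint = "https://lov.linkeddata.es/dataset/lov/sparql"  # assumed value
    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # no-op (with a warning) without the keepalive package
    try:
        results = sparql.query().convert()  # dict parsed from SPARQL JSON results
    except URLError as err:
        print("endpoint unreachable:", err.reason)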
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_lov-fuseki_on_hold.py:1414:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
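queryDuplicatedPrefix itself is defined in the test module and not visible in this log. A plausible stand-in is a query that declares the same PREFIX twice, which the SPARQL 1.1 grammar permits (the later declaration wins) and which SPARQLWrapper's query-type detection has to tolerate:

    # Hypothetical stand-in for the test module's queryDuplicatedPrefix.
    queryDuplicatedPrefix = """
        PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
        PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
        SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10
    """
    # SPARQLWrapper should still classify this as a SELECT query:
    # sparql.setQuery(queryDuplicatedPrefix)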
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_lov-fuseki_on_hold.py:1411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <...>, http_class = <class 'http.client.HTTPSConnection'>, req = <...>
http_conn_args = {'context': <...>}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source, http.client and socket frames, and create_connection source identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_lov-fuseki_on_hold.py:1443:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request frames and second do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1470: in connect
    super().connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_lov-fuseki_on_hold.py:255:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
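Every failure in this run funnels through the same __generic helper: build a SELECT query, choose a return format and HTTP method, then call query(). A rough, self-contained sketch of that flow (the endpoint path below is an assumption; only the Host header 'lov.linkeddata.es' is visible in the log):

    from SPARQLWrapper import SPARQLWrapper, CSV, GET

    # Sketch of the test flow; endpoint path assumed. During this build the
    # connection target is 127.0.0.1:9, so query() raises URLError as above.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)          # sends 'Accept: text/csv'
    result = sparql.query()              # urllib.error.URLError when offline
    print(result.convert())             # raw CSV bytes on success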
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:262:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
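The *_Conneg variants differ only in how the format is requested: pure HTTP content negotiation via the Accept header, with no extra output/format query parameter in the URL. A sketch, assuming SPARQLWrapper 2.0's setOnlyConneg and the same endpoint as above:

    from SPARQLWrapper import SPARQLWrapper, CSV

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(CSV)
    sparql.setOnlyConneg(True)   # rely on 'Accept: text/csv' alone;
                                 # no format parameter is appended to the GET URL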
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:323:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
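For JSON the wrapper sends the composite Accept header shown above, and convert() parses the SPARQL results document into a plain dict. A sketch under the same endpoint assumption:

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 3")
    sparql.setReturnFormat(JSON)
    data = sparql.query().convert()      # dict: {'head': ..., 'results': ...}
    for binding in data["results"]["bindings"]:
        print(binding["s"]["value"])     # one row per SELECT result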
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:406:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
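The _Unexpected tests cover a format/query-type mismatch: JSON-LD is a graph serialization, meaningful for CONSTRUCT or DESCRIBE results but not for SELECT, which is why the request above carries the generic 'Accept: */*' instead of a JSON-LD media type. A sketch, same endpoint assumption:

    from SPARQLWrapper import SPARQLWrapper, JSONLD

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSONLD)   # tolerated, but cannot be honoured for SELECT
    result = sparql.query()          # server decides the actual response format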
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:416:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:330:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
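A detail worth noting from the do_open() listing near the top of this excerpt: urllib always forces 'Connection: close' because its addinfourl response object cannot manage a persistent connection, and it normalizes header-name casing before sending. A small illustration (example.org stands in for a real endpoint):

    import urllib.request

    req = urllib.request.Request("https://example.org/sparql")
    req.add_header("accept", "text/csv")
    print(req.get_header("Accept"))   # 'text/csv' - name stored normalized
    # 'Connection: close' itself is injected later, inside do_open(),
    # and is never stored on the Request object.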
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:360:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
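Nothing here points at a defect in the package itself; these tests simply require a live SPARQL endpoint, which pbuilder's disabled network cannot provide. One common guard (not present in this suite, hence the failures) is to probe the endpoint once and skip the whole module when it is unreachable:

    import socket
    import pytest

    def endpoint_up(host: str, port: int = 443, timeout: float = 2.0) -> bool:
        # True if a plain TCP connection to host:port succeeds.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not endpoint_up("lov.linkeddata.es"),
        reason="SPARQL endpoint unreachable (offline build)",
    )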
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:370:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
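The two-part tracebacks ("During handling of the above exception, another exception occurred") come from implicit exception chaining: do_open catches the OSError and raises URLError from inside the except block, so the original error survives both as the chained context and as the URLError's reason:

    import urllib.error

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:                # what do_open's except catches
            raise urllib.error.URLError(err)
    except urllib.error.URLError as exc:
        print(exc)                            # <urlopen error [Errno 111] Connection refused>
        print(exc.reason is exc.__context__)  # True: implicit chaining preserved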
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/socket.py:850: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_lov-fuseki_on_hold.py:289:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
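create_connection() (listed in full at the top of this excerpt) tries every address that getaddrinfo() returns and, when all fail, re-raises the first collected error; for 127.0.0.1 there is exactly one candidate, so that first error is the ConnectionRefusedError itself:

    import socket

    # Enumerate the candidate addresses exactly as create_connection() does.
    for family, socktype, proto, canonname, sockaddr in socket.getaddrinfo(
            "127.0.0.1", 9, 0, socket.SOCK_STREAM):
        print(family, socktype, sockaddr)   # single AF_INET entry: ('127.0.0.1', 9)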
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

[do_open/create_connection traceback identical to testSelectByGETinTSV above;
 Accept: 'text/tab-separated-values', Host: 'lov.linkeddata.es',
 connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:296:
[__generic -> sparql.query() -> urllib frames and second do_open pass
 identical to testSelectByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
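For orientation, this is roughly the call pattern the failing tests drive. The endpoint path is an assumption (the log only shows the host lov.linkeddata.es), so treat the URL as a placeholder:

    from SPARQLWrapper import SPARQLWrapper, GET, TSV

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")  # assumed URL
    sparql.setQuery("SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(TSV)  # yields the Accept: text/tab-separated-values seen above
    result = sparql.query()      # raises urllib.error.URLError when unreachable
    print(result.response.read().decode("utf-8"))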
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

[do_open/create_connection traceback identical to testSelectByGETinTSV above;
 Accept: 'application/sparql-results+xml', Host: 'lov.linkeddata.es',
 connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinUnknow>

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:450:
[__generic -> sparql.query() -> urllib frames and second do_open pass
 identical to testSelectByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
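The *Unknow tests request the bogus return format "foo", yet the Accept header in their locals is still application/sparql-results+xml: an unrecognized format leaves SPARQLWrapper on its default (XML) rather than being forwarded to the server. A sketch of that behaviour (assumed URL again; exact handling of unknown formats may vary by version):

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")  # assumed URL
    sparql.setReturnFormat("foo")  # not a supported format; the default (XML) stays in effect
    # A subsequent query() would therefore still send
    # Accept: application/sparql-results+xml, as the locals above show.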
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

[do_open/create_connection traceback identical to testSelectByGETinTSV above;
 Accept: 'application/sparql-results+xml', Host: 'lov.linkeddata.es',
 connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:459:
[__generic -> sparql.query() -> urllib frames and second do_open pass
 identical to testSelectByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

[do_open/create_connection traceback identical to testSelectByGETinTSV above;
 Accept: 'application/sparql-results+xml', Host: 'lov.linkeddata.es',
 connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinXML>

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:217:
[__generic -> sparql.query() -> urllib frames and second do_open pass
 identical to testSelectByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
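Client code that must tolerate a down endpoint would catch the URLError raised by query() and inspect .reason to tell a refused connection from a timeout. A self-contained sketch (endpoint URL assumed, as before):

    import socket
    import urllib.error
    from SPARQLWrapper import SPARQLWrapper, XML

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")  # assumed URL
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setReturnFormat(XML)
    try:
        result = sparql.query()
    except urllib.error.URLError as err:
        # .reason is the wrapped OSError -- ConnectionRefusedError in this log
        if isinstance(err.reason, (ConnectionRefusedError, socket.timeout)):
            print("endpoint unreachable:", err.reason)
        else:
            raise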
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

[do_open/create_connection traceback identical to testSelectByGETinTSV above;
 Accept: 'application/sparql-results+xml', Host: 'lov.linkeddata.es',
 connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:225:
[__generic -> sparql.query() -> urllib frames and second do_open pass
 identical to testSelectByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
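All of these tests need a live endpoint, so in an offline build they can only fail. One conventional remedy (a sketch of a generic pytest pattern, not something this package currently does) is to probe the endpoint once and skip the whole module when it is unreachable:

    import socket
    import pytest

    def _reachable(host, port, timeout=3):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Module-level mark: every test in the file is skipped when offline.
    pytestmark = pytest.mark.skipif(
        not _reachable("lov.linkeddata.es", 443),
        reason="SPARQL endpoint unreachable (no network during build)",
    )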
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

[traceback as above, but over plain HTTP: self = <urllib.request.HTTPHandler
 object at 0x...>, http_class = <class 'http.client.HTTPConnection'>,
 http_conn_args = {}, Accept: '*/*', Host: 'vocabs.ands.org.au',
 dispatched via http_open at /usr/lib/python3.12/urllib/request.py:1373;
 connect still goes to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:674:
[__generic at test/test_rdf4j__geosciml.py:200 -> sparql.query() -> urllib
 frames and second do_open pass as above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
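Note the split in the locals above: req.host is '127.0.0.1:9' while the Host header names the real service (vocabs.ands.org.au here, lov.linkeddata.es earlier). That is consistent with urllib's proxy handling: the build environment evidently points the HTTP/HTTPS proxy at a dead local port, so every outbound request dies at connect(). The same routing can be reproduced explicitly (a sketch; the origin URL path is omitted because the log does not show it):

    import urllib.request

    proxies = {"http": "http://127.0.0.1:9", "https": "http://127.0.0.1:9"}
    opener = urllib.request.build_opener(urllib.request.ProxyHandler(proxies))
    # opener.open("http://vocabs.ands.org.au/") now fails with
    # URLError(ConnectionRefusedError(111, ...)), matching the log.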
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

[plain-HTTP traceback identical to testAskByGETinJSONLD_Unexpected_Conneg above;
 Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 Host: 'vocabs.ands.org.au', connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:591:
[__generic at test/test_rdf4j__geosciml.py:200 -> sparql.query() -> urllib
 frames and second do_open pass as above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

[plain-HTTP traceback identical to testAskByGETinJSONLD_Unexpected_Conneg above;
 Accept: '*/*', Host: 'vocabs.ands.org.au', connecting to ('127.0.0.1', 9)]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:628:
[__generic at test/test_rdf4j__geosciml.py:200 -> sparql.query() -> urllib
 frames and second do_open pass as above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:494:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
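As the second half of each traceback shows, urllib's do_open() catches the low-level OSError and re-raises it as URLError, so callers see a single exception type for all transport failures; the original error remains reachable through the reason attribute. A small sketch:

import urllib.error
import urllib.request

try:
    # Any unreachable endpoint works; 127.0.0.1:9 mirrors this log.
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as exc:
    # exc.reason is the wrapped ConnectionRefusedError from socket.py
    print(type(exc.reason).__name__, exc.reason)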
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:697:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
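The application-side frames are always the same four: the test's __generic() helper, Wrapper.query(), Wrapper._query(), and urlopener(request). A minimal usage sketch of that path; the endpoint URL below is illustrative, not the one the suite configures:

from SPARQLWrapper import GET, XML, SPARQLWrapper

sparql = SPARQLWrapper("http://127.0.0.1:9/repositories/example")  # hypothetical endpoint
sparql.setQuery("ASK { ?s ?p ?o }")
sparql.setMethod(GET)
sparql.setReturnFormat(XML)
result = sparql.query()  # raises urllib.error.URLError when the endpoint is unreachable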
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:606:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
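The only thing that varies across these failures is the Accept header SPARQLWrapper negotiates from the query form and the requested return format; combinations the library treats as unexpected (for example N3 for an ASK query) fall back to */*. The values observed in this log, restated as a table (this is a summary of the log, not the library's own mapping):

# Accept headers observed above, keyed by (query form, return format)
OBSERVED_ACCEPT = {
    ("ASK", "xml"): "application/sparql-results+xml",
    ("ASK", "json"): ("application/sparql-results+json,application/json,"
                      "text/javascript,application/javascript"),
    ("ASK", "n3"): "*/*",  # unexpected for ASK, falls back to */*
    ("CONSTRUCT", "json-ld"): "application/ld+json,application/x-json+ld",
}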
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:651:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
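The create_connection() listing repeated above uses a module-level sentinel so that "no timeout argument given" can be told apart from an explicit timeout=None (which means "block forever"). A sketch of the same pattern:

_GLOBAL_DEFAULT_TIMEOUT = object()  # unique sentinel, as in socket.py

def connect(timeout=_GLOBAL_DEFAULT_TIMEOUT):
    # Identity check, exactly like the stdlib code above: only an
    # explicitly supplied timeout is applied to the socket.
    if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
        print("explicit timeout:", timeout)
    else:
        print("global default timeout applies")

connect()        # global default
connect(None)    # explicit None: block forever
connect(5.0)     # explicit five-second timeout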
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:735:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:511:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
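The docstring repeated above also mentions *all_errors*: by default create_connection() re-raises only the last per-address error, while all_errors=True (Python 3.11+) raises an ExceptionGroup covering every address getaddrinfo() resolved. A sketch:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
except* ConnectionRefusedError as group:
    # One entry per address tried by getaddrinfo()
    for exc in group.exceptions:
        print(exc)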
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:912:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[do_open()/create_connection() listing and frames identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:878:
[intermediate frames identical to the first failure above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
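Since these failures are environmental rather than a bug in the package, one way such suites can stay green without network access is to stub the single choke point the tracebacks expose, urllib.request.urlopen. A sketch only; the canned XML body and the patch target are illustrative assumptions, not what this test suite actually does:

import io
import unittest.mock
import urllib.request

FAKE_ASK_RESULT = (b'<?xml version="1.0"?>'
                   b'<sparql xmlns="http://www.w3.org/2005/sparql-results#">'
                   b'<head/><boolean>true</boolean></sparql>')

with unittest.mock.patch("urllib.request.urlopen",
                         return_value=io.BytesIO(FAKE_ASK_RESULT)):
    # Inside the patch, any urlopen() call returns the canned response.
    body = urllib.request.urlopen("http://example.invalid/").read()
    print(body)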
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:878: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

(do_open, the http.client send chain and socket.create_connection repeat
 verbatim as in the first failure above:)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:950:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib.request frames as above, re-entering do_open)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

(do_open, the http.client send chain and socket.create_connection repeat
 verbatim as in the first failure above:)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:848:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib.request frames as above, re-entering do_open)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

(do_open, the http.client send chain and socket.create_connection repeat
 verbatim as in the first failure above:)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:788:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib.request frames as above, re-entering do_open)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

(do_open, the http.client send chain and socket.create_connection repeat
 verbatim as in the first failure above:)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:818:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib.request frames as above, re-entering do_open)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
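The create_connection docstring quoted in these tracebacks mentions the all_errors flag. A small standard-library sketch (Python 3.11 or later; all names are from the stdlib) of how the same refused connection surfaces as an ExceptionGroup when all_errors=True, instead of the bare raise exceptions[0] seen at socket.py:865:

    # Sketch only: with all_errors=True, socket.create_connection collects
    # every per-address failure into an ExceptionGroup rather than
    # re-raising just the last one.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print(type(exc).__name__, exc)  # ConnectionRefusedError [Errno 111] ...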
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_rdf4j__geosciml.py:986: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:758: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:931: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:893: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Content-Length': '418',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML_Conneg>

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:803:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinTURTLE_Conneg>

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:833:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
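The "During handling" half of every traceback passes through the same three SPARQLWrapper frames (query() -> _query() -> urlopener(request)). A hedged sketch of roughly what the test helper at test/test_rdf4j__geosciml.py:200 presumably drives; the endpoint URL below is an illustrative stand-in, not the tests' actual configuration:

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, POST, TURTLE

    sparql = SPARQLWrapper("http://127.0.0.1:9/repositories/test")  # hypothetical endpoint
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(POST)
    sparql.setReturnFormat(TURTLE)  # drives the Accept header seen in these requests
    sparql.setOnlyConneg(True)      # the *_Conneg tests use content negotiation only

    try:
        result = sparql.query()     # SPARQLWrapper/Wrapper.py:960 in the frames above
    except URLError as err:
        print("endpoint unreachable:", err.reason)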
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1003:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
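Note that testConstructByPOSTinUnknow_Conneg passes the bogus return format "bar", yet the request above still carries Accept: application/rdf+xml. SPARQLWrapper appears to fall back to its default format when the requested one is not recognized; a sketch of that behaviour (endpoint hypothetical, fallback inferred from the header in this log):

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/repositories/test")  # hypothetical
    sparql.setReturnFormat("bar")  # not a supported format; SPARQLWrapper warns
                                   # and keeps the default, which for a CONSTRUCT
                                   # query yields the application/rdf+xml Accept
                                   # header seen above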
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinXML_Conneg>

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:773:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1180:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
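From testDescribeByGETinCSV_Unexpected_Conneg onward the requests are GETs, so the header dict above shows Host and User-Agent rather than a form-encoded body. Note also what do_open() (full source shown earlier in this log) does to whatever headers the caller set: it forces 'Connection: close' and title-cases the names. A small sketch of the Request side of that, with a hypothetical URL and no network involved:

    import urllib.request

    req = urllib.request.Request(
        "http://127.0.0.1:9/repositories/test",  # never actually opened here
        headers={"accept": "text/turtle", "user-agent": "example/1.0"},
    )

    # Request capitalizes stored header names; do_open() later re-titlecases
    # them and adds 'Connection: close' before calling h.request().
    print(req.headers)              # {'Accept': 'text/turtle', 'User-agent': 'example/1.0'}
    print(req.get_header("Accept")) # text/turtle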
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1146:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1382: in _send_request
/usr/lib/python3.12/http/client.py:1331: in endheaders
/usr/lib/python3.12/http/client.py:1091: in _send_output
/usr/lib/python3.12/http/client.py:1035: in send
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1218:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.12/urllib/request.py:215: in urlopen
/usr/lib/python3.12/urllib/request.py:515: in open
/usr/lib/python3.12/urllib/request.py:532: in _open
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
/usr/lib/python3.12/urllib/request.py:1373: in http_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
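Each failure is really one exception chain: do_open() catches the ConnectionRefusedError raised at the socket layer and re-raises it wrapped in URLError, which is why every block shows "During handling of the above exception, another exception occurred". The original error stays reachable as URLError.reason; a minimal demonstration against the same closed port:

    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except URLError as err:
        # do_open() wrapped the low-level OSError, producing the chained
        # tracebacks seen throughout this log.
        print(type(err.reason).__name__)  # ConnectionRefusedError
        print(err.reason)                 # [Errno 111] Connection refused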
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:1218: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

test/test_rdf4j__geosciml.py:1116: in testDescribeByGETinN3_Conneg
    result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
request Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
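Every failure in this block shares one root cause: with the build's network
access disabled, the suite's SPARQL endpoint resolves to 127.0.0.1:9, where
nothing listens, so each request dies in sock.connect() with ECONNREFUSED and
urllib re-raises it as URLError. A minimal sketch of that shared failure mode
(not taken from this build; the /sparql path is a placeholder):

    import urllib.error
    import urllib.request

    try:
        # Nothing listens on port 9 (the discard port), so the kernel
        # refuses the TCP handshake outright.
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying OSError, here
        # ConnectionRefusedError(111, 'Connection refused').
        print("refused:", err.reason)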
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

test/test_rdf4j__geosciml.py:1056: in testDescribeByGETinRDFXML_Conneg
    result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
request Accept: application/rdf+xml
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

test/test_rdf4j__geosciml.py:1086: in testDescribeByGETinTURTLE_Conneg
    result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
request Accept: application/turtle,text/turtle
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

test/test_rdf4j__geosciml.py:1254: in testDescribeByGETinUnknow_Conneg
    result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
request Accept: application/rdf+xml
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

test/test_rdf4j__geosciml.py:1025: in testDescribeByGETinXML_Conneg
    result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
request Accept: application/rdf+xml
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
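The create_connection() listing shown in the full traceback above tries every
getaddrinfo() result in turn and, if none connects, re-raises the first
collected error. A short sketch of the same call failing the same way
(assuming, as in these tests, that nothing listens on 127.0.0.1:9):

    import errno
    import socket

    try:
        # Same (host, port) the tests end up connecting to.
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        # [Errno 111] on Linux, exactly as reported in the tracebacks.
        assert err.errno == errno.ECONNREFUSED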
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

test/test_rdf4j__geosciml.py:1199: in testDescribeByPOSTinCSV_Unexpected_Conneg
    result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
request Accept: */* (POST, Content-Type: application/x-www-form-urlencoded, Content-Length: 225)
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
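The POST variants differ from the GET ones only in request shape: the query
travels as an application/x-www-form-urlencoded body, hence the Content-Type
and Content-Length headers captured above. A sketch of the kind of call these
tests make; the endpoint URL here is hypothetical, not the suite's real one:

    from SPARQLWrapper import POST, SPARQLWrapper, N3

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # hypothetical endpoint
    sparql.setQuery("DESCRIBE <http://example.org/resource>")
    sparql.setMethod(POST)          # send the query in the POST body
    sparql.setReturnFormat(N3)      # drives the Accept header seen above
    sparql.query()                  # raises urllib.error.URLError while offline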
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

test/test_rdf4j__geosciml.py:1161: in testDescribeByPOSTinJSONLD_Conneg
    result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
request Accept: application/ld+json,application/x-json+ld (POST, Content-Type: application/x-www-form-urlencoded, Content-Length: 225)
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

test/test_rdf4j__geosciml.py:1237: in testDescribeByPOSTinJSON_Unexpected_Conneg
    result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
request Accept: */* (POST, Content-Type: application/x-www-form-urlencoded, Content-Length: 225)
(traceback identical to testDescribeByGETinJSON_Unexpected_Conneg above:
ConnectionRefusedError [Errno 111] on 127.0.0.1:9, re-raised as urllib.error.URLError)
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1131: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
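                # Editorial note as a code comment: the two statements that
                # follow move the proxy credentials out of the request headers.
                # They are handed to set_tunnel(), which sends them only on the
                # CONNECT request to the proxy, and then deleted from `headers`
                # so they are never forwarded to the origin server.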
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1071: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
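                # Editorial note as a code comment: in the try/except that
                # follows, any OSError raised while sending the request -- here
                # a ConnectionRefusedError, not only the timeout the stock
                # "# timeout error" remark mentions -- is re-raised as
                # urllib.error.URLError, with the original exception kept on
                # the URLError's .reason attribute.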
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1101: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
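                # Editorial note as a code comment: in h.request() below,
                # encode_chunked is True only when the caller supplied a
                # Transfer-encoding header; otherwise http.client sends the
                # body unchunked, relying on the Content-Length header (the
                # 225-byte form-encoded query in these tests).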
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1041: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_rdf4j__geosciml.py:1305: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
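                # Editorial note as a code comment: header names were
                # normalised with name.title() above, so the
                # "Proxy-Authorization" lookup in this block matches whatever
                # capitalisation the caller used when setting the header.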
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryBadFormed_1 ____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed_1(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed_1, XML, GET) test/test_rdf4j__geosciml.py:1282: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
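                # Editorial note as a code comment: testQueryBadFormed_1 above
                # expected a QueryBadFormed error parsed from the endpoint's
                # HTTP response; because the TCP connection is refused before
                # any response exists, URLError propagates instead and
                # assertRaises reports it as an error rather than a pass.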
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) test/test_rdf4j__geosciml.py:1289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) test/test_rdf4j__geosciml.py:1309: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_rdf4j__geosciml.py:1317:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:266:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
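One detail worth noting in these locals: the TCP connection goes to host = '127.0.0.1:9' while the Host header still names vocabs.ands.org.au. That is the pattern urllib produces when an HTTP proxy is configured, which is presumably how this builder keeps the tests off the network; a hypothetical illustration only, since the builder's actual mechanism is not shown in this log:

import os
import urllib.error
import urllib.request

# hypothetical offline setup: route all HTTP through a dead local "proxy"
os.environ["http_proxy"] = "http://127.0.0.1:9"

opener = urllib.request.build_opener()  # default ProxyHandler reads http_proxy here
try:
    opener.open("http://vocabs.ands.org.au/", timeout=5)
except urllib.error.URLError as err:
    # the socket goes to 127.0.0.1:9; the Host header keeps the origin name
    print(err.reason)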
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:409:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:326:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:363:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
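The innermost frame is plain socket code, so the error is reproducible without urllib at all (same assumption that port 9 is closed):

import socket

try:
    # create_connection() tries each getaddrinfo() result and re-raises
    # the last connect() failure -- here, ECONNREFUSED from 127.0.0.1:9
    socket.create_connection(("127.0.0.1", 9), timeout=5)
except OSError as err:
    print(err)  # [Errno 111] Connection refused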
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:296:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:451:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:233:
[... same ConnectionRefusedError / URLError traceback as above ...]
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:281: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
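The second half of each traceback is urllib's error translation: do_open() catches the OSError raised during connect and re-raises it as urllib.error.URLError, which is why pytest reports two chained exceptions per test. A sketch of the same wrapping, assuming the same unreachable address:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)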
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:432: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:341: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
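The create_connection() source quoted in these tracebacks also explains the *all_errors* keyword: the loop tries every address returned by getaddrinfo(), collects the per-address exceptions, and on total failure raises either the last one (the default, as in these tests) or an ExceptionGroup of all of them. A sketch of the grouped form, which needs Python 3.11+:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # one entry per (family, sockaddr) combination that was attempted
        for exc in group.exceptions:
            print(exc)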
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
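On the application side, each test funnels through __generic() at test/test_rdf4j__geosciml.py:200, which drives the SPARQLWrapper API before the request reaches urllib. Roughly the calls being exercised by the *_Conneg variants, shown against a hypothetical endpoint URL:

    from SPARQLWrapper import JSON, POST, SPARQLWrapper

    sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    sparql.setOnlyConneg(True)  # rely on content negotiation only
    results = sparql.query().convert()  # query() is where the URLError surfaces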
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) test/test_rdf4j__geosciml.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:250: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
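Since these failures are environmental rather than regressions, one way a suite like this could guard its network-dependent cases is a connectivity probe plus pytest's skipif; a sketch with hypothetical names, not something the package currently does:

    import socket

    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    @pytest.mark.skipif(
        not endpoint_reachable("example.org", 80),  # hypothetical endpoint
        reason="SPARQL endpoint unreachable; offline build",
    )
    def test_select_by_post_in_json():
        ...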
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) test/test_stardog__lindas.py:678: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) test/test_stardog__lindas.py:595: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
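From testAskByGETinJSONLD_Unexpected_Conneg onward the failing module is test_stardog__lindas.py, and the only mechanical difference is the scheme: https_open() passes an ssl context to http.client.HTTPSConnection, which adds one extra connect() frame (http/client.py:1470) before the same socket failure. A sketch of the HTTPS path, assuming the same dead address:

    import ssl
    import urllib.error
    import urllib.request

    ctx = ssl.create_default_context()
    try:
        urllib.request.urlopen("https://127.0.0.1:9/query", timeout=5, context=ctx)
    except urllib.error.URLError as err:
        print(err.reason)  # still ECONNREFUSED; the TLS handshake never starts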
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:632:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:720:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:498:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
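The only thing that varies between these failures is the request each test builds: every test asks SPARQLWrapper for a different return format (XML, JSON, N3, JSON-LD, CSV, or an unknown token such as "foo"), and in the onlyConneg variants the format is expressed purely through the Accept header shown in each traceback. A sketch of the client-side calls the tests wrap (endpoint URL and query text are illustrative, not taken from the test file):

from SPARQLWrapper import GET, JSON, SPARQLWrapper

sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # endpoint path assumed
sparql.setQuery("ASK WHERE { ?s ?p ?o }")
sparql.setMethod(GET)
sparql.setReturnFormat(JSON)  # -> Accept: application/sparql-results+json,...
result = sparql.query()       # reaches urlopener(request) in Wrapper.py:926;
                              # raises URLError in this offline build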
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:701:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '162',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:610:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
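Note the mismatch visible in every do_open frame: the socket target is host = '127.0.0.1:9' while the Host header still names lindas.admin.ch. That is consistent with proxy-based blackholing of the network for this build (the log states up front that network access is disabled): if https_proxy points at an unreachable address, urllib connects to the dead proxy and only carries the real hostname in the tunnel/Host header. A sketch of the same effect; the proxy value here is an assumption about the setup, not read from this log:

import os
import urllib.error
import urllib.request

# Assumed blackhole proxy: connections go to 127.0.0.1:9, while the real
# hostname appears only in the tunnel/Host header, as in the tracebacks.
os.environ["https_proxy"] = "http://127.0.0.1:9"

try:
    urllib.request.urlopen("https://lindas.admin.ch/query", timeout=5)
except urllib.error.URLError as exc:
    print(exc.reason)  # [Errno 111] Connection refused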
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg>

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:655:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '162',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:739:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '162',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg>

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:515:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:916:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
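The create_connection() docstring quoted in the first traceback describes *source_address* and *all_errors*. A short illustrative sketch of that contract (assuming Python 3.11+ for all_errors and except*), pointed at the same unreachable address:

    # Sketch only: with all_errors=True the per-address connect failures
    # are raised together as an ExceptionGroup instead of just the last one.
    import socket

    try:
        socket.create_connection(
            ("127.0.0.1", 9),        # unreachable address from this log
            timeout=2,
            source_address=("", 0),  # '' / port 0: let the OS choose
            all_errors=True,
        )
    except* ConnectionRefusedError as group:
        print(len(group.exceptions), "candidate address(es) refused")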
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
  'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:882:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
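The Accept headers that vary between these failures come from SPARQLWrapper's content negotiation: the return format each test selects is translated into the Accept value shown in the request headers (JSONLD above becomes 'application/ld+json,application/x-json+ld'). A hedged sketch of that call path; the endpoint URL is illustrative, not the tests' actual configuration:

    # Sketch only: drives the same query() path that fails in this log.
    from SPARQLWrapper import JSONLD, SPARQLWrapper

    sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # assumed URL
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSONLD)  # -> Accept: application/ld+json,...
    result = sparql.query()  # raises urllib.error.URLError with no network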
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:954:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
  'Connection': 'close', 'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:852:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
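The do_open() source repeated in these tracebacks merges the request's unredirected headers over its ordinary headers, forces Connection: close (urllib's addinfourl cannot manage a persistent connection), and Title-Cases the header names. Replaying just that header logic with values from this log:

    # Sketch only: mirrors the header handling shown in do_open() above.
    req_headers = {"accept": "application/rdf+xml"}
    unredirected = {"Host": "lindas.admin.ch",
                    "User-Agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)"}

    headers = dict(unredirected)
    headers.update({k: v for k, v in req_headers.items() if k not in headers})
    headers["Connection"] = "close"  # never keep-alive for this one request
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Host': 'lindas.admin.ch',
    #  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
    #  'Accept': 'application/rdf+xml', 'Connection': 'close'}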
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close',
  'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_stardog__lindas.py:792:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
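do_open() also contains the proxy branch: for a tunneled HTTPS request, Proxy-Authorization is copied into the CONNECT tunnel headers and deleted before the request to the origin server. A sketch with a hypothetical proxy host and placeholder credential:

    # Sketch only: proxy host and credential below are placeholders.
    import http.client

    conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=5)
    conn.set_tunnel("lindas.admin.ch", 443,
                    headers={"Proxy-Authorization": "Basic ..."})
    # conn.request("GET", "/") would first issue CONNECT lindas.admin.ch:443
    # with the tunnel headers, then send the origin request without them.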
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
  'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_stardog__lindas.py:822:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close',
  'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:990:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

[do_open(), http.client and socket.create_connection() frames identical to
 testAskByPOSTinXML_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close',
  'Host': 'lindas.admin.ch',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:762:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and do_open() source identical to testAskByPOSTinXML_Conneg]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) test/test_stardog__lindas.py:935: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as in
testConstructByPOSTinCSV_Unexpected_Conneg above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:897:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
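The test ids encode what each case exercises: query form (CONSTRUCT or DESCRIBE), requested return format, and HTTP method. A hedged sketch of the call pattern behind a line like `self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)` — the endpoint URL and query text below are illustrative, not taken from the test module:

from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

endpoint = "https://lindas.admin.ch/query"  # assumption: illustrative endpoint URL
sparql = SPARQLWrapper(endpoint)
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")  # illustrative query
sparql.setMethod(POST)           # POST variants send an urlencoded body (the Content-Type above)
sparql.setReturnFormat(JSONLD)   # drives the Accept header seen in the log
result = sparql.query()          # Wrapper.py:960 -> _query() -> urlopen(), which fails offline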
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:973:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:867:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
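socket.create_connection(), whose docstring appears in each traceback, loops over the getaddrinfo() results for the address and, with the default all_errors=False, re-raises only the last failure (an ExceptionGroup of all of them with all_errors=True). A minimal sketch of the call that sits at the bottom of every failure here, again assuming local port 9 is closed:

import socket

try:
    # The same address the proxied test requests end up dialing.
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print(err)  # [Errno 111] Connection refused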
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_stardog__lindas.py:807:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_stardog__lindas.py:837:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:1007:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
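Comparing the headers dicts across these blocks shows the content negotiation under test: each requested return format maps to a specific Accept header, and an unknown format such as "bar" falls back to the default for graph queries. Distilled from this log as an observation (not copied from SPARQLWrapper's internal tables):

# Accept headers observed per requested format for CONSTRUCT/DESCRIBE queries:
ACCEPT_BY_FORMAT = {
    "json-ld": "application/ld+json,application/x-json+ld",
    "n3": "application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3",
    "turtle": "application/turtle,text/turtle",
    "xml": "application/rdf+xml",
    "bar": "application/rdf+xml",  # unknown format -> default
    "csv": "*/*",                  # "unexpected" format for a graph query -> generic Accept
    "json": "*/*",                 # likewise unexpected -> generic Accept
}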
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[same urllib.request.do_open / socket.create_connection traceback as above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:777:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[same urllib.request.do_open / socket.create_connection traceback as above;
the GET variants send no body, hence no Content-Length or Content-Type]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:1183:
[same SPARQLWrapper/Wrapper.py -> urllib call chain as above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld',
           'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:1149:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:1221:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
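The failing tests all exercise SPARQLWrapper's DESCRIBE path. A minimal
sketch of what __generic() amounts to, assuming the public LINDAS endpoint
URL and an illustrative DESCRIBE target (both are assumptions for this
sketch, not values taken from the suite):

    from SPARQLWrapper import GET, JSONLD, SPARQLWrapper

    sparql = SPARQLWrapper("https://lindas.admin.ch/query")   # assumed URL
    sparql.setQuery("DESCRIBE <http://example.org/resource>")  # placeholder IRI
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)
    sparql.setOnlyConneg(True)  # negotiate the format via Accept only
    result = sparql.query()     # raises urllib.error.URLError when offline

Under pbuilder the socket connect fails before any HTTP exchange happens,
which is why every format variant fails identically.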
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:1119:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_stardog__lindas.py:1059:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
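As the captured headers show, onlyConneg mode changes nothing between
formats except the Accept header (the N3 test offers six media types, the
RDF/XML test just one). A sketch of the same negotiation with plain urllib,
using a placeholder URL:

    import urllib.request

    req = urllib.request.Request(
        "https://lindas.admin.ch/query?query=...",  # placeholder URL
        headers={
            "Accept": "application/turtle,text/turtle,text/rdf+n3,"
                      "application/n-triples,application/n3,text/n3",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = resp.read()  # serialization chosen by the server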
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_stardog__lindas.py:1089:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:1257:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
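Note that do_open() wraps the low-level OSError in a URLError, so the
original ConnectionRefusedError remains reachable through the exception's
.reason attribute; a short sketch:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the wrapped ConnectionRefusedError
        print("endpoint unreachable:", err.reason)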
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:1029:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open()/create_connection() frames and source listings identical to the
first failure above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:1202:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
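Suites like this one commonly guard endpoint-dependent tests with a
reachability probe so that offline builds skip instead of erroring out; a
hypothetical pytest sketch (the helper below is invented for illustration
and is not part of this test suite):

    import socket

    import pytest

    def endpoint_reachable(host: str = "lindas.admin.ch", port: int = 443) -> bool:
        # Probe once; treat any socket error as "unreachable".
        try:
            socket.create_connection((host, port), timeout=5).close()
            return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not endpoint_reachable(),
        reason="SPARQL endpoint unreachable (offline build)",
    )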
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) test/test_stardog__lindas.py:1202: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) test/test_stardog__lindas.py:1164: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = '*/*'

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:1240:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
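The "During handling of the above exception, another exception occurred" banner in each failure is Python's implicit exception chaining: do_open() raises its URLError while the ConnectionRefusedError is still being handled, so both tracebacks are printed. A small sketch of the same chaining, built only from the pieces visible above:

    import urllib.error

    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:  # mirrors do_open()'s `except OSError as err:`
        wrapped = urllib.error.URLError(err)
        # the original error rides along as the reason; re-raising `wrapped`
        # here would produce the chained two-part traceback shown above
        assert wrapped.reason is err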
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3'

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:1134:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
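The create_connection() listing in the first failure shows why a single ConnectionRefusedError surfaces: one socket is attempted per getaddrinfo() result and the first collected error is re-raised once every candidate fails (the `raise exceptions[0]` frame at socket.py:865). A standalone sketch of that loop, simplified: the source-address bind and the all_errors/ExceptionGroup branch are omitted, and an empty getaddrinfo() result is not handled.

    import socket

    def connect_first_working(host: str, port: int, timeout: float = 5.0) -> socket.socket:
        errors: list[OSError] = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock  # first address that answers wins
            except OSError as err:
                errors.append(err)
                if sock is not None:
                    sock.close()
        raise errors[0]  # same choice as socket.py:865 above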
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/rdf+xml'

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_stardog__lindas.py:1074:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/turtle,text/turtle'

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_stardog__lindas.py:1104:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/rdf+xml'

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:1274:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
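The DESCRIBE failures above differ only in the Accept header negotiated for each requested return format; the unknown format "bar" evidently falls back to the RDF/XML header. Collected from the tracebacks for reference (the dictionary keys are illustrative labels, not SPARQLWrapper constants; the values are copied verbatim from the requests above):

    # Accept headers observed in the failing DESCRIBE requests
    ACCEPT_BY_FORMAT = {
        "XML / RDFXML / unknown": "application/rdf+xml",
        "TURTLE": "application/turtle,text/turtle",
        "N3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "JSONLD": "application/ld+json,application/x-json+ld",
        "JSON (unexpected for DESCRIBE)": "*/*",
    }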
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/rdf+xml'

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:1044:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_stardog__lindas.py:1307:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
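Notably, even this keep-alive test's request carries 'Connection': 'close': the do_open() listing above unconditionally overwrites that header, so keep-alive can presumably only take effect when SPARQLWrapper installs a handler that bypasses do_open(). The test body from the log, made standalone; the endpoint URL below is an assumption for illustration (the suite's real `endpoint` value was rewritten to the unreachable 127.0.0.1:9):

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    endpoint = "https://lindas.admin.ch/query"  # assumed URL, path may differ

    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # warns and has no effect if the keepalive module is absent
    sparql.query()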
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/sparql-results+xml'

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_stardog__lindas.py:1298:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

[urllib/socket traceback identical to testDescribeByPOSTinJSONLD_Conneg above]
host = '127.0.0.1:9'
headers['Accept'] = 'application/sparql-results+xml'

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_stardog__lindas.py:1293:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_stardog__lindas.py:1311:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_stardog__lindas.py:1319:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:270:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
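For context, the call chain in these tracebacks is plain SPARQLWrapper usage:
the suite builds a query, picks a return format (which only changes the Accept
header that Wrapper._query sends, e.g. text/csv above), and calls query(). A
rough equivalent of what testSelectByGETinCSV_Conneg drives is sketched below;
the endpoint URL is an assumption, since the log only shows the Host header
('lindas.admin.ch'), not the request path:

    from SPARQLWrapper import CSV, GET, SPARQLWrapper

    # Endpoint URL assumed for illustration; only the host appears in the log.
    sparql = SPARQLWrapper("https://lindas.admin.ch/query")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)   # selects Accept: text/csv, as in the log

    # query() is SPARQLWrapper/Wrapper.py:960 in the traceback; with the
    # network disabled it raises urllib.error.URLError instead of returning.
    result = sparql.query()
    print(result.convert())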
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:413:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:330:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:367:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
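These tests all require a reachable SPARQL endpoint, which the isolated build
environment deliberately does not provide. One common way to keep such a suite
green offline is to translate a refused connection into a skip rather than a
failure; the helper below is a hypothetical sketch, not part of this test
suite:

    import unittest
    import urllib.error

    def skip_if_offline(exc: urllib.error.URLError) -> None:
        # Hypothetical helper: treat a refused connection (what this log
        # shows for every test) as "no network" and skip instead of failing.
        if isinstance(exc.reason, ConnectionRefusedError):
            raise unittest.SkipTest(f"endpoint unreachable: {exc.reason}")
        raise exc

    # Usage inside a test, around the sparql.query() call:
    #     try:
    #         result = sparql.query()
    #     except urllib.error.URLError as exc:
    #         skip_if_offline(exc)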
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:300:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:455:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
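For context, the call path at the top of each of these tracebacks (query() -> _query() -> urlopener()) is SPARQLWrapper's standard client API. A hedged sketch of what the suite's __generic() helper presumably sets up (endpoint path and query text are illustrative assumptions, not taken from the test source):

    from SPARQLWrapper import SPARQLWrapper, XML, GET

    sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # assumed URL
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")   # assumed query
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)  # yields Accept: application/sparql-results+xml
    sparql.setOnlyConneg(True)   # the "_Conneg" variants negotiate via Accept only
    result = sparql.query()      # Wrapper.py:960 -> urlopen(), as in the frames above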
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg>

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:285: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
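The Accept values in the headers locals vary with the requested return format; collected here from this log for reference (the mapping is read off the failures themselves, not from SPARQLWrapper's source):

    ACCEPT_BY_FORMAT = {
        "XML":  "application/sparql-results+xml",
        "JSON": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "CSV":  "text/csv",
        "TSV":  "text/tab-separated-values",
    }
    # Formats that are unexpected for a SELECT (JSONLD, N3) send Accept: */*,
    # and the deliberately bogus "foo"/"bar" formats fall back to the XML value.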
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:436: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinJSON_Conneg>

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:345: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
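The middle frames (open -> _open -> _call_chain -> https_open -> do_open) are urllib's handler dispatch. The same chain can be observed directly by raising the debug level on the stock handlers (standard urllib API; the request itself would fail here for the same reason as the tests):

    import urllib.request

    opener = urllib.request.build_opener(
        urllib.request.HTTPSHandler(debuglevel=1),  # prints the wire-level request
    )
    # opener.open("https://lindas.admin.ch/query") would run open() -> _open()
    # -> _call_chain() -> https_open() -> do_open(), the exact frames above.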
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinN3_Unexpected_Conneg>

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:390: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV_Conneg>

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:315: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow_Conneg>

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:474: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [handler locals, do_open() source and the http.client/socket frames are
     identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

self = <test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg>

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:254: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [__generic() -> sparql.query() -> urlopen() -> https_open -> do_open()
     frames identical to testSelectByGETinUnknow_Conneg above, ending in:]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
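From here the suite moves from test_stardog__lindas.py (HTTPS against lindas.admin.ch) to test_store__v1_1_4.py (plain HTTP against rdf.chise.org), so the following tracebacks dispatch through http_open at request.py:1373 with http.client.HTTPConnection and lack the TLS super().connect() frame at http/client.py:1470. Both schemes converge on the same socket.create_connection() call; a sketch of the only moving part:

    import http.client

    CONNECTION_BY_SCHEME = {
        "http":  http.client.HTTPConnection,   # http_open, request.py:1373
        "https": http.client.HTTPSConnection,  # https_open, request.py:1392
    }

    conn = CONNECTION_BY_SCHEME["http"]("127.0.0.1:9", timeout=5)
    # conn.request("GET", "/") would raise ConnectionRefusedError(111) here,
    # exactly as in the failures that follow.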
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source identical to the failures above, ending in:]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.12/http/client.py:1382: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1331: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.12/http/client.py:1091: in _send_output
    self.send(msg)
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/http/client.py:1001: in connect
    self.sock = self._create_connection(
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    [create_connection() source identical to the failures above, ending in:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV(self): > result = self.__generic(askQuery, CSV, GET) test/test_store__v1_1_4.py:520: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, GET, onlyConneg=True) test/test_store__v1_1_4.py:527: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
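The test/__generic() helper driving these cases is not itself shown in the log; judging from the frames above it boils down to the SPARQLWrapper calls below. A sketch, with the endpoint URL assumed from the Host: header and an illustrative ASK query:

    from SPARQLWrapper import CSV, GET, SPARQLWrapper

    sparql = SPARQLWrapper("http://rdf.chise.org/sparql")  # assumed endpoint URL
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")              # illustrative askQuery
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)  # yields the 'Accept: text/csv' header above
    result = sparql.query()      # raises URLError while the endpoint is unreachable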
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) test/test_store__v1_1_4.py:583: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
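For JSON the wrapper negotiates with a list of four media types rather than a single one, as the Accept header above shows, presumably because endpoints differ in which JSON alias they advertise. The list as data:

    accept_json = ",".join([
        "application/sparql-results+json",
        "application/json",
        "text/javascript",
        "application/javascript",
    ])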
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) test/test_store__v1_1_4.py:673: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
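The *_Unexpected_Conneg cases ask for a graph serialization (here JSON-LD) from an ASK query, a combination the wrapper cannot honour, so it falls back to the generic 'Accept: */*' visible above; the _Conneg suffix means only HTTP content negotiation is used, with no extra output-format URL parameter. A sketch of that configuration, endpoint URL assumed as before:

    from SPARQLWrapper import GET, JSONLD, SPARQLWrapper

    sparql = SPARQLWrapper("http://rdf.chise.org/sparql")  # assumed endpoint URL
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setOnlyConneg(True)      # Accept header only, no format= parameter
    sparql.setReturnFormat(JSONLD)  # not an ASK result format, hence 'Accept: */*'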
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) test/test_store__v1_1_4.py:590: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_store__v1_1_4.py:627: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
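None of these errors point at a defect in SPARQLWrapper itself; they are the expected outcome of exercising live-endpoint tests in an offline chroot. One common way to let such a suite degrade gracefully offline is to skip on URLError, sketched below (a hypothetical helper, not what the package does today):

    import urllib.error

    import pytest

    def query_or_skip(sparql):
        try:
            return sparql.query()
        except urllib.error.URLError as err:
            pytest.skip(f"SPARQL endpoint unreachable: {err.reason}")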
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) test/test_store__v1_1_4.py:560: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
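create_connection(), listed repeatedly above, loops over every address getaddrinfo() returns and by default re-raises only the last failure; its all_errors flag (Python 3.11+) surfaces all of them as an ExceptionGroup instead. A sketch using the except* syntax:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # one entry per address family/sockaddr combination attempted
        print(len(group.exceptions), "connection attempt(s) refused")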
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_store__v1_1_4.py:718: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
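testAskByGETinUnknow_Conneg hands the helper the made-up format name "foo"; the wrapper then negotiates for its default format, which is why the Accept header above asks for application/sparql-results+xml. A sketch checking that default, assuming returnFormat remains a readable attribute as in the Wrapper.py frames above:

    from SPARQLWrapper import SPARQLWrapper, XML

    sparql = SPARQLWrapper("http://rdf.chise.org/sparql")  # assumed endpoint URL
    print(sparql.returnFormat == XML)  # True: XML is the wrapper's default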
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
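Two details of do_open() explain the header dicts seen throughout: it forces 'Connection: close' because urllib's response wrapper cannot manage persistent connections, and it title-cases every header name before sending. The normalization in isolation:

    headers = {"accept": "text/csv", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    # {'Accept': 'text/csv', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}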
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) test/test_store__v1_1_4.py:494: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1373: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:942:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:981:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
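Note: the application frames in these tracebacks (test_store__v1_1_4.py:188 -> SPARQLWrapper/Wrapper.py:960 -> 926) all walk the same query path. A minimal sketch of that path, assuming the documented SPARQLWrapper 2.0.0 API; the endpoint URL and query are illustrative only, not taken from the test suite:

    from SPARQLWrapper import SPARQLWrapper, GET, N3

    sparql = SPARQLWrapper("http://rdf.chise.org/sparql")  # hypothetical endpoint URL
    sparql.setMethod(GET)
    sparql.setReturnFormat(N3)  # selects the Accept header seen in the locals above
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    result = sparql.query()  # raises urllib.error.URLError when the endpoint is unreachable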
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:797:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_store__v1_1_4.py:834:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:1020:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
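Note: the Accept header is the only value that varies between these failures; here it is 'application/rdf+xml' even though the test requests the unknown format "foo", so the wrapper apparently falls back to its default RDF serialisation for CONSTRUCT queries. The header names themselves are canonicalised by do_open() with str.title() before sending, which is why the locals consistently show 'User-Agent' and 'Connection'. A quick illustration:

    for name in ("user-agent", "ACCEPT", "proxy-authorization"):
        print(name.title())  # User-Agent, Accept, Proxy-Authorization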
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:763:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1287:
[intermediate frames identical to the first failure above]
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
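The two-layer traceback shape repeated here is urllib's standard error path: do_open() catches the OSError raised by the socket layer and re-raises it as URLError, keeping the original exception in the .reason attribute. A sketch (the /sparql path is a placeholder, not the suite's real endpoint):

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason holds the underlying ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)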
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1097:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
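The 'Accept: application/rdf+xml' header in this failure comes from requesting RDFXML for a DESCRIBE query. A hedged sketch of the equivalent SPARQLWrapper calls; the endpoint URL and query text are placeholders, not the test fixtures:

    from SPARQLWrapper import SPARQLWrapper, RDFXML, GET

    sparql = SPARQLWrapper("http://example.org/sparql")        # placeholder
    sparql.setQuery("DESCRIBE <http://example.org/resource>")  # placeholder
    sparql.setMethod(GET)
    sparql.setReturnFormat(RDFXML)  # yields 'Accept: application/rdf+xml'
    # sparql.query() raises URLError here, since the endpoint is unreachable.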
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:1326:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
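testDescribeByGETinUnknow_Conneg passes the unsupported format string "foo", and the logged Accept header ('application/rdf+xml') shows SPARQLWrapper falling back to a default format for DESCRIBE. A sketch of that call, under the assumption that setReturnFormat() warns rather than raises on unknown formats (consistent with the test reaching the network stage):

    from SPARQLWrapper import SPARQLWrapper, GET

    sparql = SPARQLWrapper("http://example.org/sparql")        # placeholder
    sparql.setQuery("DESCRIBE <http://example.org/resource>")  # placeholder
    sparql.setMethod(GET)
    sparql.setReturnFormat("foo")  # unsupported: warns, keeps a default format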
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1062:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
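These connection-refused failures are the expected outcome of running a network-dependent suite in an offline builder. One illustrative way to make such tests skip cleanly instead of failing (this is not what the package's suite does; the names below are hypothetical):

    import socket

    import pytest

    def endpoint_reachable(host="127.0.0.1", port=9, timeout=1.0):
        # True only if a TCP connection to the endpoint can be opened.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Module-level mark: every test in the module is skipped while offline.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable(),
        reason="SPARQL endpoint unreachable (offline build)",
    )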
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_store__v1_1_4.py:1371:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
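testKeepAlive is the one test whose body appears in full above. A sketch of the same sequence with a placeholder endpoint; note the assumption (not shown by this log) that setUseKeepAlive() only takes effect when the optional keepalive package is importable, and is otherwise a warned no-op:

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # effective only if 'keepalive' is installed
    # sparql.query() raises URLError in this offline environment.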
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_store__v1_1_4.py:1356:
[intermediate frames identical to the first failure above]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
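testQueryBadFormed expects QueryBadFormed, which SPARQLWrapper raises from an HTTP 400 answer; the server must therefore be reachable for the expected exception to occur, so offline the URLError surfaces first. A self-contained sketch of the assertion pattern, with a hypothetical stand-in for the failing query:

    import unittest

    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    def run_bad_query():
        # Hypothetical stand-in for sparql.query() hitting a live endpoint
        # that answers HTTP 400 to a malformed query.
        raise QueryBadFormed("simulated HTTP 400 response")

    class BadFormedExample(unittest.TestCase):
        def test_bad_query(self):
            with self.assertRaises(QueryBadFormed):
                run_bad_query()

    if __name__ == "__main__":
        unittest.main()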
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_store__v1_1_4.py:1362:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
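The queryDuplicatedPrefix fixture is not shown in this log; judging by the test name it re-declares a PREFIX, which SPARQL 1.1 permits (a later declaration overrides an earlier one). A plausible, purely illustrative shape:

    # Hypothetical fixture shape; the real queryDuplicatedPrefix may differ.
    queryDuplicatedPrefix = """
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    PREFIX foaf: <http://xmlns.com/foaf/0.1/>
    SELECT * WHERE { ?s ?p ?o } LIMIT 10
    """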
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_store__v1_1_4.py:1359:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
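For reference, the Accept headers actually observed in this log, keyed by the requested return format and query type (all values copied from the failures above):

    # Accept headers sent by sparqlwrapper 2.0.0, as recorded in this log.
    ACCEPT_BY_FORMAT = {
        "XML (SELECT)": "application/sparql-results+xml",
        "CSV (SELECT)": "text/csv",
        "JSON (SELECT)": "application/sparql-results+json,application/json,"
                         "text/javascript,application/javascript",
        "RDFXML (DESCRIBE)": "application/rdf+xml",
        "unexpected format (DESCRIBE conneg)": "*/*",
    }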
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

[do_open()/create_connection() traceback identical to the first failure above]
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_store__v1_1_4.py:1387:
[intermediate frames identical to the first failure above]
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
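queryWithCommaInUri is likewise not shown; per its name it exercises IRIs containing commas, which are legal in SPARQL and encoding-sensitive on GET requests. A plausible, purely illustrative shape:

    # Hypothetical fixture shape; the real queryWithCommaInUri may differ.
    queryWithCommaInUri = """
    SELECT * WHERE { <http://example.org/resource,with,commas> ?p ?o } LIMIT 10
    """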
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_store__v1_1_4.py:247:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
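For context, the failing __generic helper in the test suite ultimately issues a call along these lines (a hedged sketch; the endpoint URL and query text are placeholders, only the SPARQLWrapper API names come from the frames above):

# Sketch of the query path visible in the SPARQLWrapper/Wrapper.py frames:
# query() -> _query() -> urlopen(request). Endpoint and query are dummies.
from SPARQLWrapper import CSV, GET, SPARQLWrapper

sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # unreachable in chroot
sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
sparql.setMethod(GET)
sparql.setReturnFormat(CSV)
result = sparql.query()  # raises urllib.error.URLError as logged above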
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:254:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
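The do_open listing repeated in these tracebacks normalizes headers before sending. Isolated, that merge logic comes down to the following (a sketch of the stdlib behaviour quoted above, not a new API):

# Header handling as performed in do_open: unredirected headers take
# precedence, Connection is forced to "close" (addinfourl cannot manage a
# persistent connection), and names are Title-Cased last.
def merge_headers(unredirected_hdrs, headers):
    merged = dict(unredirected_hdrs)
    merged.update({k: v for k, v in headers.items() if k not in merged})
    merged["Connection"] = "close"
    return {name.title(): val for name, val in merged.items()}

print(merge_headers({"Host": "rdf.chise.org"}, {"accept": "text/csv"}))
# {'Host': 'rdf.chise.org', 'Accept': 'text/csv', 'Connection': 'close'}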
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_store__v1_1_4.py:310:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
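socket.create_connection, whose source pytest prints in full for each failure, reduces to the loop below (a simplified sketch that keeps the stdlib shape shown above; the function name is mine):

# Simplified create_connection: try every (family, type, proto) candidate
# from getaddrinfo and re-raise the last failure if none connects -- which
# is exactly how the single [Errno 111] surfaces for 127.0.0.1:9.
import socket

def create_connection_sketch(address, timeout=None):
    host, port = address
    last_error = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
            host, port, 0, socket.SOCK_STREAM):
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            if timeout is not None:
                sock.settimeout(timeout)
            sock.connect(sa)
            return sock
        except OSError as exc:
            last_error = exc
            if sock is not None:
                sock.close()
    raise last_error if last_error else OSError("getaddrinfo returned nothing")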
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_store__v1_1_4.py:403:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
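The two stacked tracebacks per test come from implicit exception chaining: do_open raises URLError while the OSError is still being handled, so Python records the original as __context__ and pytest prints both, joined by the "During handling of the above exception" line. A toy equivalent:

# Toy version of the chaining in urllib.request's do_open: raising inside
# an except block links the new exception to the old one via __context__.
from urllib.error import URLError

def toy_do_open():
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:  # timeout error (comment kept from request.py)
        raise URLError(err)

try:
    toy_do_open()
except URLError as exc:
    print(exc.reason)                      # [Errno 111] Connection refused
    print(type(exc.__context__).__name__)  # ConnectionRefusedError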
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:317:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
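The middle frames (open -> _open -> _call_chain -> http_open) are urllib's per-scheme handler dispatch. The handler table that _call_chain walks can be inspected offline (handle_open is the same attribute visible in the _open frame above):

# OpenerDirector keeps per-scheme handler lists; _call_chain tries each
# handler's http_open/https_open until one returns a response.
import urllib.request

opener = urllib.request.build_opener()
for scheme in ("http", "https"):
    names = [type(h).__name__ for h in opener.handle_open.get(scheme, [])]
    print(scheme, names)  # e.g. http ['HTTPHandler'], https ['HTTPSHandler']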
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:357:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
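The one branch of do_open these tests never take is the CONNECT-tunnel path. Its header split, printed repeatedly above, isolates to the following sketch (the credential value is a dummy):

# Proxy-Authorization must reach only the proxy during CONNECT tunnelling,
# never the origin server, so do_open moves it into the tunnel headers.
def split_proxy_auth(headers):
    tunnel_headers = {}
    if "Proxy-Authorization" in headers:
        tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")
    return headers, tunnel_headers

origin, tunnel = split_proxy_auth({
    "Host": "rdf.chise.org",
    "Proxy-Authorization": "Basic ZHVtbXk6ZHVtbXk=",  # dummy credentials
})
print(origin)  # {'Host': 'rdf.chise.org'}
print(tunnel)  # {'Proxy-Authorization': 'Basic ZHVtbXk6ZHVtbXk='}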
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:287:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
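The Accept values captured in these headers dicts line up with the return format each test requests. A hedged reconstruction of that mapping, assembled only from what this log shows (the authoritative table lives in SPARQLWrapper/Wrapper.py):

# Accept headers as observed in this log, keyed by requested format. For a
# SELECT query, graph formats such as JSON-LD or N3 are "unexpected" and
# the wrapper appears to fall back to */*, while a wholly unknown format
# like "foo" falls back to the XML result type instead.
ACCEPT_OBSERVED = {
    "xml":  "application/sparql-results+xml",
    "json": "application/sparql-results+json,application/json,"
            "text/javascript,application/javascript",
    "csv":  "text/csv",
    "tsv":  "text/tab-separated-values",
    "json-ld": "*/*",                             # unexpected for SELECT
    "n3":      "*/*",                             # unexpected for SELECT
    "foo":     "application/sparql-results+xml",  # unknown -> XML fallback
}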
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'rdf.chise.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testQueryWithComma_3 above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:448:
[SPARQLWrapper and urllib frames identical to testQueryWithComma_3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
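The create_connection docstring quoted above mentions the all_errors flag: with it set, every per-address failure is reported as an ExceptionGroup rather than only the last error being re-raised. A sketch (Python 3.11+; assumes port 9 is closed locally, as it is in this chroot):

# With all_errors=True, create_connection raises an ExceptionGroup holding
# the failure for each attempted address, caught here with except*.
import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
except* ConnectionRefusedError as group:
    for exc in group.exceptions:
        print(exc)  # [Errno 111] Connection refused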
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
test/test_store__v1_1_4.py:221:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1373: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
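The two-part tracebacks show urllib's error chaining: do_open() catches the OSError from the socket layer and re-raises it as URLError, which is why pytest prints "During handling of the above exception, another exception occurred". Callers can still reach the original error through the .reason attribute (sketch under the same dead-endpoint assumption):

    import urllib.request
    import urllib.error

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as exc:
        # The wrapped ConnectionRefusedError is preserved as exc.reason.
        print(type(exc.reason).__name__, exc.reason)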
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
test/test_virtuoso__v7_20_3230__dbpedia.py:526:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
test/test_virtuoso__v7_20_3230__dbpedia.py:533:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
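The recurring SPARQLWrapper/Wrapper.py:960 (query) and :926 (_query) frames are the library's public query path. Roughly what each test's __generic helper does (a sketch with a made-up query; the real test bodies are not reproduced in this log):

    from SPARQLWrapper import SPARQLWrapper, CSV, GET

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # endpoint implied by the Host: dbpedia.org header
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")             # an ASK query, as in testAskByGETinCSV
    sparql.setMethod(GET)                                 # these tests all exercise GET
    sparql.setReturnFormat(CSV)                           # drives the Accept: text/csv header
    result = sparql.query()                               # raises URLError when the endpoint is unreachable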
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
test/test_virtuoso__v7_20_3230__dbpedia.py:586:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
test/test_virtuoso__v7_20_3230__dbpedia.py:593:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
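Across this suite only the Accept header varies with the requested return format; the "Unknow" variants request the unsupported format "foo" and fall back to SPARQL XML results. The mapping, as read directly off the headers dicts in these tracebacks:

    # Accept header per requested return format (reconstructed from this log).
    ACCEPT_BY_FORMAT = {
        "csv":  "text/csv",
        "tsv":  "text/tab-separated-values",
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "xml":  "application/sparql-results+xml",
        "foo":  "application/sparql-results+xml",  # unknown format -> XML fallback
    }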
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
test/test_virtuoso__v7_20_3230__dbpedia.py:556:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
test/test_virtuoso__v7_20_3230__dbpedia.py:563:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
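For builds like this one, where the network is deliberately unavailable, a common pattern is to probe the endpoint once and skip the suite instead of failing it. A hypothetical guard (not something this package currently does):

    import socket
    import pytest

    def _endpoint_reachable(host: str, port: int) -> bool:
        # Single cheap TCP probe; any OSError (refused, unreachable,
        # timeout) means the endpoint cannot be tested.
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not _endpoint_reachable("dbpedia.org", 443),
        reason="SPARQL endpoint unreachable (offline build environment)",
    )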
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
test/test_virtuoso__v7_20_3230__dbpedia.py:728:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
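The create_connection docstring quoted throughout also mentions *all_errors*: by default only the last per-address error is raised (here, the single ConnectionRefusedError), while all_errors=True aggregates every failed attempt into an ExceptionGroup on Python 3.11+. A quick illustration against the same dead port:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except ExceptionGroup as eg:
        # One ConnectionRefusedError per getaddrinfo() result tried.
        print(len(eg.exceptions), eg.exceptions[0])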
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:737: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinXML>

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
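For orientation, each failing test reduces to a SPARQLWrapper call of roughly this shape. A sketch, not the test module's exact code; askQuery and the endpoint constant live in test/test_virtuoso__v7_20_3230__dbpedia.py, and the literal query string here is only illustrative:

    from SPARQLWrapper import GET, XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # resolves to 127.0.0.1:9 in this environment
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")             # stand-in for the module's askQuery
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError when the endpoint is unreachable

The *_Conneg variants additionally call sparql.setOnlyConneg(True), so the desired format is requested via the Accept header alone rather than an extra query parameter.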
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '228',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:608:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
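The POST variant differs only in transport: the query is URL-encoded into the request body, which is why this traceback shows Content-Type: application/x-www-form-urlencoded and a Content-Length. A sketch of the equivalent plain-urllib request (query text and body values illustrative only):

    from urllib.parse import urlencode
    from urllib.request import Request

    body = urlencode({"query": "ASK WHERE { ?s ?p ?o }"}).encode("ascii")
    req = Request(
        "https://dbpedia.org/sparql",
        data=body,  # supplying data makes urllib issue a POST
        headers={"Content-Type": "application/x-www-form-urlencoded",
                 "Accept": "application/sparql-results+json,application/json,"
                           "text/javascript,application/javascript"},
    )
    print(req.get_method())  # 'POST'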
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:950:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
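The underlying refusal can also be demonstrated at the socket layer, bypassing urllib entirely. A minimal sketch against the same closed port:

    import socket

    try:
        # create_connection() tries each getaddrinfo() result in turn and
        # re-raises the last error when none succeeds (socket.py:865 above).
        with socket.create_connection(("127.0.0.1", 9), timeout=5):
            pass
    except ConnectionRefusedError as exc:
        print(exc)  # [Errno 111] Connection refused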
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD>

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:895:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD_Conneg>

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:904:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinN3>

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:866:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg>

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:873:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:1347: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:809:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
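The *_Conneg variants differ only in the Accept header they negotiate. A hedged sketch of the mapping, with the media-type lists copied from the headers recorded in these tracebacks (the accept_header helper is hypothetical, not part of the package):

    ACCEPT = {
        "rdf+xml": "application/rdf+xml",
        "turtle": "application/turtle,text/turtle",
        "n3": ("application/turtle,text/turtle,text/rdf+n3,"
               "application/n-triples,application/n3,text/n3"),
    }

    def accept_header(fmt: str) -> str:
        # Unknown formats ("foo" in the Unknow tests below) fall back to
        # RDF/XML, matching the Accept header those tests actually send.
        return ACCEPT.get(fmt, ACCEPT["rdf+xml"])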
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:833:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
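Note the recurring pair host = '127.0.0.1:9' alongside a Host: dbpedia.org header. One plausible reading, consistent with this build environment disabling network access, is that an HTTPS proxy pointing at the discard port is configured, so urllib connects to the proxy while still addressing dbpedia.org. This sketch is an assumption, not taken from the log:

    import urllib.request

    req = urllib.request.Request(
        "https://dbpedia.org/sparql",
        headers={"Accept": "application/turtle,text/turtle"})
    # With an https proxy set, req.host becomes the proxy address and
    # dbpedia.org moves to req._tunnel_host, matching the do_open() frames.
    req.set_proxy("127.0.0.1:9", "https")
    # urllib.request.urlopen(req)  # would raise the URLError seen here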
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:841:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
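Each failure carries two layers: the ConnectionRefusedError raised by sock.connect() and the URLError that do_open() wraps around it. A minimal sketch of unpicking the two, using only the standard library:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as exc:
        # The original OSError is preserved on .reason.
        assert isinstance(exc.reason, ConnectionRefusedError)
        print(exc.reason)   # [Errno 111] Connection refused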
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1048:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
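The loop in create_connection() shown earlier iterates over getaddrinfo() results, so a single logical endpoint may yield several connect attempts (for example IPv4 and IPv6). A small sketch of that resolution step:

    import socket

    for af, socktype, proto, canonname, sa in socket.getaddrinfo(
            "127.0.0.1", 9, 0, socket.SOCK_STREAM):
        print(af, sa)   # e.g. AddressFamily.AF_INET ('127.0.0.1', 9)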
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1056:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
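As the create_connection() docstring above notes, passing all_errors=True (Python 3.11+) raises an ExceptionGroup instead of only the last per-address error. A short sketch:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as eg:
        for err in eg.exceptions:
            print(type(err).__name__, err)   # one entry per failed address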
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:772:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
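do_open() forces "Connection: close" because urllib's addinfourl reads the response body until EOF and would block on a kept-alive socket. A sketch of the equivalent request made directly with http.client (example.com is a placeholder host, not from the log):

    import http.client

    conn = http.client.HTTPConnection("example.com", 80, timeout=5)
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    body = resp.read()   # EOF is guaranteed: the server closes the connection
    conn.close()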
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:779:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
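The header normalisation step in do_open() title-cases every header name, which is what makes the later "Proxy-Authorization" lookup reliable regardless of how the caller spelled it. For example:

    headers = {"connection": "close", "proxy-authorization": "Basic ..."}
    headers = {name.title(): val for name, val in headers.items()}
    assert list(headers) == ["Connection", "Proxy-Authorization"]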
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.12/urllib/request.py:1344: in do_open
/usr/lib/python3.12/http/client.py:1336: in request
/usr/lib/python3.12/http/client.py:1001: in connect
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:977:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
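The *ByPOST* tests send the query form-encoded in the request body, which is why Content-Type and Content-Length now appear among the captured headers. A minimal sketch with urllib (the query text is illustrative):

    import urllib.parse
    import urllib.request

    data = urllib.parse.urlencode(
        {"query": "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5"}).encode("ascii")
    req = urllib.request.Request("https://dbpedia.org/sparql", data=data,
                                 headers={"Accept": "*/*"})
    # urllib adds Content-Type: application/x-www-form-urlencoded and
    # Content-Length automatically for byte bodies.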
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

[traceback identical to testConstructByPOSTinCSV_Unexpected_Conneg above, except:
 headers: Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3; Content-Length: 439
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:880
     result = self.__generic(constructQuery, N3, POST)
 root cause: ConnectionRefusedError: [Errno 111] Connection refused (127.0.0.1:9),
 re-raised as urllib.error.URLError at /usr/lib/python3.12/urllib/request.py:1347]
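Note: all of the failures in this run share one root cause. Every request ends
up connecting to 127.0.0.1:9 (the requests carry Host: dbpedia.org, but the
connection target in this build environment is 127.0.0.1:9, where nothing is
listening), so socket.create_connection() fails with ECONNREFUSED and urllib's
do_open() re-raises the OSError as URLError. A minimal sketch of that failure
mode (address and port are taken from the tracebacks; the /sparql path is only
an illustrative guess):

    import socket
    import urllib.error
    import urllib.request

    # Socket level: the kernel refuses the TCP connection outright,
    # which is the ConnectionRefusedError at socket.py:850 above.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)           # [Errno 111] Connection refused

    # urllib level: do_open() catches the OSError and wraps it, which is
    # the urllib.error.URLError at urllib/request.py:1347 above.
    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        print(err.reason)    # [Errno 111] Connection refused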
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

[traceback identical to testConstructByPOSTinCSV_Unexpected_Conneg above, except:
 headers: Accept: application/rdf+xml; Content-Length: 408
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1073
     result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)]
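For orientation, the __generic() helper seen in the frames above drives
SPARQLWrapper roughly as sketched below. This is a reconstruction from the
call chain (Wrapper.py:960 query() -> Wrapper.py:926 _query() -> urlopen),
not the test suite's actual code; the endpoint URL and query text are
placeholders:

    from SPARQLWrapper import CSV, POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # unreachable endpoint
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)
    sparql.setReturnFormat(CSV)
    sparql.setOnlyConneg(True)  # the *_Conneg tests negotiate via Accept only

    # query() builds the urllib request and calls urlopen(); with the
    # endpoint unreachable this raises the URLError shown above.
    result = sparql.query()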
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

[traceback identical to the above, except: GET request without a body;
 headers: Accept: */*; Host: dbpedia.org; User-Agent: sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1266
     result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)]
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

[traceback identical to the above, except:
 headers: Accept: application/ld+json,application/x-json+ld; Host: dbpedia.org
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1211
     result = self.__generic(describeQuery, JSONLD, GET)]
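The headers dicts printed in these tracebacks are already normalized by
do_open(): it forces a non-persistent connection and title-cases every header
name, per the listing above. A standalone snippet showing just that
transformation (sample values copied from this log):

    headers = {"accept": "application/ld+json,application/x-json+ld",
               "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'application/ld+json,application/x-json+ld',
    #  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
    #  'Connection': 'close'}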
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

[traceback identical to testDescribeByGETinJSONLD above, except:
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1220
     result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)]
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

[traceback identical to the above, except:
 headers: Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1181
     result = self.__generic(describeQuery, N3, GET)]
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

[traceback identical to testDescribeByGETinN3 above, except:
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1188
     result = self.__generic(describeQuery, N3, GET, onlyConneg=True)]
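The socket.py frames above come from the standard library's address-iteration
loop: create_connection() tries every address returned by getaddrinfo() and,
if all attempts fail, re-raises the first collected error (the
"raise exceptions[0]" at socket.py:865 in the frames above). A trimmed sketch
of that shape, with a hypothetical helper name:

    import socket

    def connect_any(host, port, timeout=5.0):
        errors = []
        for af, kind, proto, _cname, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, kind, proto)
                sock.settimeout(timeout)
                sock.connect(sa)          # fails with ECONNREFUSED here
                return sock
            except OSError as err:
                errors.append(err)
                if sock is not None:
                    sock.close()
        raise errors[0]

    # connect_any("127.0.0.1", 9) raises ConnectionRefusedError,
    # matching the failures in this log.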
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

[traceback identical to the above, except:
 headers: Accept: application/rdf+xml
 raising test frame: test/test_virtuoso__v7_20_3230__dbpedia.py:1117
     result = self.__generic(describeQuery, RDFXML, GET)]
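Across the summarized failures, the only meaningful per-test variation is the
Accept header that SPARQLWrapper derives from the requested return format.
The mapping below is collected verbatim from the tracebacks in this log (the
dict and its key names are only an illustration, not SPARQLWrapper's internal
table):

    # Return format requested by the test -> Accept header observed in this log
    ACCEPT_SEEN = {
        "RDFXML": "application/rdf+xml",
        "JSONLD": "application/ld+json,application/x-json+ld",
        "N3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "TURTLE": "application/turtle,text/turtle",
        "CSV (unexpected for CONSTRUCT/DESCRIBE)": "*/*",
        '"bar" (unknown format)': "application/rdf+xml",
    }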
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1148: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/turtle,text/turtle]
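The create_connection() docstring quoted in full above mentions the *all_errors* flag (Python 3.11+): by default only the last per-address error is raised, while all_errors=True collects every attempt's error into an ExceptionGroup. A small sketch of the difference, again assuming an unreachable 127.0.0.1:9:

    import socket

    # Default (all_errors=False): only the last error is raised.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2)
    except OSError as err:
        print("last error only:", err)

    # all_errors=True: the errors are collected into an ExceptionGroup.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except ExceptionGroup as group:
        for exc in group.exceptions:
            print("collected:", type(exc).__name__, exc)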
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1156: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/turtle,text/turtle]
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1364: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/rdf+xml]
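The do_open() excerpt repeated in these tracebacks also shows why every request here carries 'Connection: close' and Title-Cased header names: urllib's addinfourl cannot manage persistent connections, so the handler forces the connection closed and normalizes header capitalization. The same two steps in isolation, with sample values:

    # Stand-alone illustration of do_open()'s header handling.
    headers = {"accept": "application/rdf+xml",
               "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)"}
    headers["Connection"] = "close"    # force a non-persistent connection
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'application/rdf+xml', 'User-Agent': '...', 'Connection': 'close'}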
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1372: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/rdf+xml]
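create_connection(), shown in full above, loops over every address tuple getaddrinfo() returns and only raises after all candidates fail. A sketch of inspecting those candidates for the endpoint used here:

    import socket

    # Enumerate the candidate addresses create_connection() would try.
    for family, type_, proto, canonname, sockaddr in socket.getaddrinfo(
            "127.0.0.1", 9, 0, socket.SOCK_STREAM):
        print(family, type_, proto, sockaddr)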
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1087: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/rdf+xml]
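The tunneling branch of do_open(), also quoted above, moves Proxy-Authorization onto the CONNECT tunnel so the credential is never forwarded to the origin server. The same header split in isolation, with hypothetical values:

    # Hypothetical headers mirroring do_open()'s tunnel branch.
    headers = {"Proxy-Authorization": "Basic dXNlcjpwYXNz",
               "Accept": "application/rdf+xml"}
    tunnel_headers = {}
    proxy_auth_hdr = "Proxy-Authorization"
    if proxy_auth_hdr in headers:
        tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
        del headers[proxy_auth_hdr]   # never sent to the origin server
    print(tunnel_headers)  # accompanies the CONNECT request
    print(headers)         # accompanies the tunneled origin request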
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1094: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/rdf+xml]
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v7_20_3230__dbpedia.py:1416: [traceback identical to
testDescribeByGETinRDFXML above; Accept: application/sparql-results+json,
application/json,text/javascript,application/javascript]
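testKeepAlive shows the full client-side call sequence the suite exercises. A sketch of the same sequence against a live endpoint (the URL is a placeholder; setUseKeepAlive() only warns and has no effect when the optional keepalive module is absent):

    from urllib.error import URLError
    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # no-op (with a warning) unless keepalive is installed

    try:
        results = sparql.query().convert()  # parsed JSON as a dict
        print(len(results["results"]["bindings"]), "rows")
    except URLError as err:
        print("endpoint unreachable:", err.reason)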
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1401: [traceback identical to
testDescribeByGETinRDFXML above: the endpoint is unreachable, so the expected
QueryBadFormed is never raised and URLError propagates instead]
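Since every failure in this run is the same refused connection rather than a defect in SPARQLWrapper itself, one common way to keep endpoint-dependent tests from failing in offline builds is to probe the endpoint once and skip when it is unreachable. This is only a sketch of that pattern, not something this test suite currently does:

    import urllib.error
    import urllib.request

    import pytest

    ENDPOINT = "https://dbpedia.org/sparql"  # placeholder

    def endpoint_reachable(url: str, timeout: float = 5.0) -> bool:
        try:
            urllib.request.urlopen(url, timeout=timeout)
        except urllib.error.HTTPError:
            return True   # got an HTTP response, so the host is up
        except urllib.error.URLError:
            return False  # no route, refused, DNS failure, ...
        return True

    # Applied module-wide: pytest reports skips instead of failures offline.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable(ENDPOINT),
        reason="SPARQL endpoint unreachable (offline build?)",
    )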
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_virtuoso__v7_20_3230__dbpedia.py:1401: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
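As the do_open() listing shows, urllib.request converts any OSError raised while sending the request into urllib.error.URLError, keeping the original exception as the .reason attribute. A small sketch of how calling code observes this (the URL is illustrative):

from urllib.error import URLError
from urllib.request import urlopen

try:
    # Fails at the socket layer exactly like the tests above.
    urlopen("https://127.0.0.1:9/sparql", timeout=5)
except URLError as err:
    # err.reason is the wrapped ConnectionRefusedError from create_connection().
    print(type(err.reason).__name__, err.reason)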
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1404:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
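The failing tests all funnel through the suite's __generic() helper, which builds a SPARQLWrapper query and calls query() (SPARQLWrapper/Wrapper.py:960 in the chain above). A sketch of the equivalent direct usage; the endpoint URL here is illustrative, since the real endpoint is unreachable during the build:

from SPARQLWrapper import SPARQLWrapper, XML, GET

sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # unreachable in the chroot
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(GET)        # these tests exercise the GET code path
sparql.setReturnFormat(XML)  # drives the Accept header seen in the locals
result = sparql.query()      # raises URLError when the connection is refused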
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1428:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
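The Accept headers captured in the locals track the requested return format: 'text/csv' for the CSV tests, 'text/tab-separated-values' for TSV, and the application/sparql-results+json list for JSON. The *_Conneg variants request the format through content negotiation alone. A sketch, assuming SPARQLWrapper's setOnlyConneg() behaves as these tests suggest (the endpoint URL is illustrative):

from SPARQLWrapper import SPARQLWrapper, CSV

sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # illustrative endpoint
sparql.setOnlyConneg(True)   # rely on the Accept header only, no format params
sparql.setReturnFormat(CSV)  # -> Accept: text/csv, as in the locals above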
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:308:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
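Note that the Accept header in the two JSONLD tests is '*/*': JSON-LD is not an expected result format for a SELECT query (hence the _Unexpected test names), so the wrapper appears to fall back to a wildcard Accept header rather than advertising a JSON-LD media type. A sketch of the request these tests set up (endpoint URL illustrative):

from SPARQLWrapper import SPARQLWrapper, JSONLD, GET

sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # illustrative endpoint
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(GET)
sparql.setReturnFormat(JSONLD)  # unexpected for SELECT; Accept becomes */*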
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1336: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:315:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
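The chain sparql.query() -> QueryResult(self._query()) -> urlopener(request) seen in every traceback is what the tests' __generic helper drives. A minimal sketch of the equivalent SPARQLWrapper usage; the endpoint URL and query text are placeholders, not values taken from the test suite:

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")      # placeholder endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")   # placeholder query
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)   # produces the Accept header logged above

    result = sparql.query()        # Wrapper.py:960; fails here in this build
    data = result.convert()       # parse the response per the chosen format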
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:278:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:285:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:448:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
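The two "Unknow" tests pass the unsupported format string "foo", yet the request above still advertises Accept: application/sparql-results+xml, which suggests SPARQLWrapper falls back to its XML default when the requested return format is not recognized. A sketch under that assumption (placeholder endpoint and query):

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")      # placeholder endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")   # placeholder query

    # "foo" is not a recognized return format; judging by the Accept header
    # in the log, the wrapper warns and keeps application/sparql-results+xml.
    sparql.setReturnFormat("foo")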
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:457:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:214:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:222:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_virtuoso__v7_20_3230__dbpedia.py:428:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
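Unlike the GET tests, the POST variants send 'Content-Type': 'application/x-www-form-urlencoded' with a Content-Length, i.e. the query moves from the URL into the request body. An illustrative urllib-level equivalent (placeholder endpoint and query; this is not SPARQLWrapper's internal code):

    import urllib.parse
    import urllib.request

    body = urllib.parse.urlencode(
        {"query": "SELECT ?s WHERE { ?s ?p ?o } LIMIT 5"}
    ).encode("ascii")

    req = urllib.request.Request(
        "https://dbpedia.org/sparql",
        data=body,                    # supplying data makes this a POST
        headers={"Accept": "*/*"},
    )
    # urllib fills in Content-Type: application/x-www-form-urlencoded and
    # Content-Length automatically for a bytes payload with no explicit type.
    response = urllib.request.urlopen(req, timeout=5)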
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:439: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to the failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:528:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper/urllib call chain and second do_open frame identical to the failure above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
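The two-part tracebacks above ("During handling of the above exception, another exception occurred") come from ordinary exception chaining: urllib catches the OSError inside an except block and raises URLError there, so the original ConnectionRefusedError stays attached as the new exception's context. A small illustration, independent of the test suite:

# Sketch of the exception chaining shown above: raising URLError from
# inside the `except OSError` handler (what do_open does at
# request.py:1347) preserves the original error as __context__.
import urllib.error

try:
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:
        raise urllib.error.URLError(err)
except urllib.error.URLError as wrapped:
    assert isinstance(wrapped.reason, ConnectionRefusedError)
    assert isinstance(wrapped.__context__, ConnectionRefusedError)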
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

[request arguments and do_open/create_connection traceback identical to testAskByGETinCSV above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:535:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain identical to testAskByGETinCSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to the failures above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:588:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain and second do_open frame identical to the failures above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

[request arguments and do_open/create_connection traceback identical to testAskByGETinJSON above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:595:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain identical to testAskByGETinJSON above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
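These failures are environmental rather than bugs in the code under test: the suite unconditionally queries a live SPARQL endpoint. A hedged sketch of one common way to make such tests skip instead of fail when the endpoint is unreachable; the names ENDPOINT, _endpoint_reachable, and needs_network are illustrative, not part of SPARQLWrapper's test suite:

# Guard endpoint-dependent tests with a reachability probe so an offline
# build environment produces skips, not connection errors.
import socket

import pytest

ENDPOINT = ("live.dbpedia.org", 443)  # the host the failing tests target

def _endpoint_reachable(host_port, timeout=3.0):
    """Return True if a TCP connection to the endpoint can be opened."""
    try:
        with socket.create_connection(host_port, timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, DNS failures
        return False

needs_network = pytest.mark.skipif(
    not _endpoint_reachable(ENDPOINT),
    reason="SPARQL endpoint unreachable (offline build environment?)",
)

@needs_network
def test_ask_by_get_in_json():
    ...  # would run the same GET with its JSON Accept header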
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to the failures above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:558:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain and second do_open frame identical to the failures above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

[request arguments and do_open/create_connection traceback identical to testAskByGETinTSV above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:565:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain identical to testAskByGETinTSV above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to the failures above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:731:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain and second do_open frame identical to the failures above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
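The "Unknow" tests pass the bogus return format "foo"; the Accept header above shows the wrapper falling back to application/sparql-results+xml rather than erroring out. Roughly the call these tests make, sketched with an illustrative endpoint URL and query:

# Hedged sketch of the request the 'Unknow' tests drive.  Passing "foo"
# does not raise; as the Accept header in the log shows, the wrapper
# warns and keeps its default format.
from SPARQLWrapper import GET, SPARQLWrapper

sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
sparql.setQuery("ASK WHERE { ?s ?p ?o }")
sparql.setMethod(GET)
sparql.setReturnFormat("foo")  # unknown format; default Accept is used
result = sparql.query()        # raises URLError while the endpoint is unreachable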
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

[request arguments and do_open/create_connection traceback identical to testAskByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:740:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
[call chain identical to testAskByGETinUnknow above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

[failure chain identical to testAskByGETinUnknow_Conneg above]

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:494:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

[failure chain identical to testAskByGETinUnknow_Conneg above]

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:502:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
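Because the suite unconditionally dials the live endpoint, every test fails once live.dbpedia.org resolves to an unroutable address. A hedged sketch of a reachability guard such a suite could add; the constants and helper below are hypothetical, not taken from the package's test code:

import socket
import unittest

ENDPOINT_HOST, ENDPOINT_PORT = "live.dbpedia.org", 443  # assumed endpoint

def endpoint_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    # Probe with a plain TCP connect; any OSError (refused, timeout,
    # resolution failure) means the endpoint tests cannot run.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

@unittest.skipUnless(endpoint_reachable(ENDPOINT_HOST, ENDPOINT_PORT),
                     "SPARQL endpoint unreachable")
class OnlineTests(unittest.TestCase):
    ...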
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

[failure chain identical to testAskByGETinUnknow_Conneg above; only the
Accept header differs]

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:954:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
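The Accept headers in these failures track the requested return format: 'application/sparql-results+xml' for the ASK/XML tests, '*/*' for the unexpected-CSV case, and media-type lists for the RDF serializations. A sketch of the corresponding SPARQLWrapper call sequence, assuming the public 2.0.0 API named in the frames above; the endpoint URL and query are illustrative:

from SPARQLWrapper import SPARQLWrapper, GET, JSONLD

sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(GET)
sparql.setReturnFormat(JSONLD)  # drives the Accept header, e.g.
                                # 'application/ld+json,application/x-json+ld'
sparql.setOnlyConneg(True)      # content negotiation only, as in the
                                # *_Conneg tests
result = sparql.query()         # raises urllib.error.URLError when the
                                # endpoint is unreachable, as seen here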
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

[failure chain identical to testAskByGETinUnknow_Conneg above; only the
Accept header differs]

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:899:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
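The create_connection docstring reproduced in each traceback documents the timeout, source_address, and all_errors parameters. A small usage sketch under those documented semantics; host, port, and timeout values are illustrative:

import socket

try:
    with socket.create_connection(
        ("live.dbpedia.org", 443),
        timeout=5.0,             # per-socket timeout instead of the
                                 # getdefaulttimeout() global
        source_address=("", 0),  # '' / port 0: let the OS choose
    ) as sock:
        print("connected via", sock.getsockname())
except OSError as err:
    # With all_errors=False (the default) only the last per-address
    # error is raised, e.g. ConnectionRefusedError [Errno 111]
    print("connect failed:", err)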
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

[failure chain identical to testAskByGETinUnknow_Conneg above]

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:908:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

[failure chain identical to testAskByGETinUnknow_Conneg above; only the
Accept header differs]

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:869:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
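The do_open listing shows why urllib always sends Connection: close, and the frame order (request -> endheaders -> send -> connect) explains why a refused TCP connection surfaces from h.request(). A sketch of the same path one level below urllib, using only the http.client calls named in the frames; host and port match the address these tests resolved to:

import http.client

conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5)
try:
    # connect() only happens when the request is flushed, so a refused
    # TCP connection surfaces from request() before any TLS or HTTP
    # exchange takes place.
    conn.request("GET", "/sparql?query=ASK%20%7B%7D",
                 headers={"Host": "live.dbpedia.org",
                          "Connection": "close"})
except OSError as err:
    print("refused before any HTTP exchange:", err)
finally:
    conn.close()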
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

[failure chain identical to testAskByGETinUnknow_Conneg above]

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:876:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

[failure chain identical to testAskByGETinUnknow_Conneg above; only the
Accept header differs]

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:805:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
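The tunnelling branch of do_open moves Proxy-Authorization onto the CONNECT request so it is never forwarded to the origin server. A sketch of that behaviour via http.client.set_tunnel; the proxy address and credentials are placeholders, not values from this build:

import http.client

conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=5)
conn.set_tunnel("live.dbpedia.org", 443,
                headers={"Proxy-Authorization": "Basic <credentials>"})
# conn.request(...) would now issue CONNECT through the proxy first;
# the headers sent to the origin afterwards no longer include
# Proxy-Authorization, matching the deletion seen in do_open above.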
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

[failure chain identical to testAskByGETinUnknow_Conneg above]

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:812:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
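The docstring also mentions all_errors: with all_errors=True, create_connection raises an ExceptionGroup covering every address getaddrinfo returned, instead of only the last error. A sketch of handling that variant (Python 3.11+ except* syntax; the address is illustrative):

import socket

try:
    socket.create_connection(("localhost", 9), timeout=2, all_errors=True)
except* ConnectionRefusedError as group:
    # One entry per resolved address that refused the connection.
    for exc in group.exceptions:
        print("candidate address refused:", exc)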
If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) test/test_virtuoso__v8_03_3313__dbpedia.py:812: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> http.client -> socket.create_connection traceback identical to testConstructByGETinRDFXML_Conneg above]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:836:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urlopen -> https_open -> do_open chain identical to the first failure above]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
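Every failure in this run is the same event: the request is addressed to live.dbpedia.org (the Host header), but the socket is opened to 127.0.0.1:9, because the build environment evidently routes HTTP(S) through a proxy on the discard port so that no traffic can leave. Nothing listens there, so connect() is refused before any bytes are sent. A minimal sketch that reproduces the same wrapped URLError outside the test suite (the endpoint URL is illustrative):

    import urllib.request
    import urllib.error

    # Route HTTPS through a proxy on 127.0.0.1:9 (the discard port, where
    # nothing listens), mirroring what the build environment appears to do.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
    )
    try:
        opener.open("https://live.dbpedia.org/sparql", timeout=5)
    except urllib.error.URLError as err:
        # do_open() wraps the ConnectionRefusedError, as in the tracebacks above.
        print(err.reason)   # [Errno 111] Connection refused (on Linux)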
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:844:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
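On the application side the chain is identical in every failure: the test calls __generic(), which calls sparql.query() (SPARQLWrapper/Wrapper.py:960), which hands a urllib request to urlopen (Wrapper.py:926). A sketch of the kind of CONSTRUCT query these tests issue, using the public SPARQLWrapper 2.0 API; the endpoint and query text are illustrative stand-ins for the fixtures in test_virtuoso__v8_03_3313__dbpedia.py:

    from SPARQLWrapper import SPARQLWrapper, RDFXML, GET

    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")  # illustrative endpoint
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")    # illustrative query
    sparql.setMethod(GET)
    sparql.setReturnFormat(RDFXML)  # yields the Accept: application/rdf+xml header

    # query() performs the HTTP request; with networking disabled it raises
    # the urllib.error.URLError recorded throughout this log.
    result = sparql.query()
    print(result.response.read()[:200])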
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1053:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
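At the bottom of each traceback, socket.create_connection() iterates the getaddrinfo() results for ('127.0.0.1', 9), collects the per-address errors, and re-raises the last one (socket.py:865). The refusal is reproducible with the standard library alone:

    import socket

    # Port 9 is the discard service; nothing listens on it in this
    # environment, so connect() fails immediately.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)   # [Errno 111] Connection refused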
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1061:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
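The do_open() listings above all take the req._tunnel_host branch: for HTTPS via a proxy, http.client first opens a TCP connection to the proxy and only then issues a CONNECT for the origin host. With the proxy on the discard port, the initial connect() is refused before the CONNECT is ever written, which is why the Host header names live.dbpedia.org while the socket targets 127.0.0.1:9. A condensed sketch of that mechanism, with the same illustrative hosts:

    import http.client

    conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5)  # the "proxy"
    conn.set_tunnel("live.dbpedia.org", 443)  # origin host for the CONNECT request
    try:
        conn.request("GET", "/sparql")  # connect() to 127.0.0.1:9 happens here
    except ConnectionRefusedError as err:
        print(err)   # refused before any CONNECT is sent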
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:775:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:782:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1272:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open -> create_connection traceback identical to the first failure above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1217:
[__generic -> sparql.query() -> urlopen -> do_open chain identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. 
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1226:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
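Across these failures only the Accept header changes with the requested return format. The negotiation lists visible in the log can be summarized as a simple mapping; this is a descriptive reconstruction from the log, not SPARQLWrapper's internal table:

    # Accept headers observed above, keyed by the requested return format.
    ACCEPT_BY_FORMAT = {
        "json-ld": "application/ld+json,application/x-json+ld",
        "n3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "turtle": "application/turtle,text/turtle",
        "rdf+xml": "application/rdf+xml",  # also sent for XML and unknown formats
    }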
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1194:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1123:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
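The two-level traceback repeats because do_open() wraps any OSError from the transport in a URLError. A self-contained sketch reproducing that wrapping against the same closed local port:

    import urllib.error
    import urllib.request

    try:
        # Port 9 (discard) is closed here, as in the build environment above.
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original OSError, e.g. ConnectionRefusedError.
        print("wrapped:", type(err.reason).__name__, err.reason)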
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1154:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
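The create_connection() docstring quoted repeatedly above also documents *all_errors* (Python 3.11+): instead of re-raising only the last per-address error, the function raises an ExceptionGroup of all of them. A small sketch against the same closed port:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except ExceptionGroup as eg:
        # One entry per getaddrinfo() result that failed to connect.
        for exc in eg.exceptions:
            print(type(exc).__name__, exc)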
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1370:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
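The tunnel branch of do_open() quoted above forwards only Proxy-Authorization to the CONNECT request and strips it from the origin headers. Roughly the same thing done directly with http.client, using a placeholder proxy address and credential that are not taken from this build log:

    import http.client

    # Placeholder proxy and credential, for illustration only.
    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
    conn.set_tunnel("live.dbpedia.org", 443,
                    headers={"Proxy-Authorization": "Basic <redacted>"})
    try:
        # The CONNECT is sent lazily, on the first request over the tunnel.
        conn.request("GET", "/sparql")
    except OSError as err:
        print("tunnel failed:", err)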
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>       h.request(req.get_method(), req.selector, req.data, headers,
                  encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.12/urllib/request.py:1344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/http/client.py:1035: in send
    self.connect()
/usr/lib/python3.12/socket.py:865: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError
During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1378:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinXML>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1093:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
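Note: the *_Conneg test variants request the result format through the Accept header alone, with no format parameter in the query string. A sketch of the call pattern these tests wrap, assuming the public DBpedia endpoint (unreachable in this build) and an illustrative resource URI:

    from SPARQLWrapper import SPARQLWrapper, XML

    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
    sparql.setQuery("DESCRIBE <http://dbpedia.org/resource/Asturias>")  # illustrative
    sparql.setReturnFormat(XML)
    sparql.setOnlyConneg(True)  # negotiate via the Accept header only
    graph = sparql.query().convert()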
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v8_03_3313__dbpedia.py:1422:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
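Note: testKeepAlive is the one test whose body is fully visible above; it drives SPARQLWrapper with GET, JSON results and keep-alive enabled. The same sequence outside unittest (the endpoint is unreachable in this build, so query() raises URLError exactly as shown):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")  # unreachable here
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # falls back with a warning if the optional
                              # keepalive package is not installed
    results = sparql.query().convert()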
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
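Note: testQueryBadFormed expects the endpoint to reject a syntactically invalid query with an error that SPARQLWrapper maps to QueryBadFormed; with the connection refused, URLError pre-empts that mapping, so even this negative test fails. A sketch of the expected behaviour against a reachable endpoint (the query text is hypothetical):

    import unittest

    from SPARQLWrapper import SPARQLWrapper, GET
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    class BadQueryTest(unittest.TestCase):
        def test_bad_query_raises(self):
            sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
            sparql.setQuery("SELECT WHERE { this is not SPARQL }")  # malformed on purpose
            sparql.setMethod(GET)
            # the endpoint's HTTP 400 response is translated to QueryBadFormed
            self.assertRaises(QueryBadFormed, sparql.query)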
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryDuplicatedPrefix>

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1413:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryManyPrefixes>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1410:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryWithComma_1>

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1426:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1433:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.12/urllib/request.py:515: in open
    response = self._open(req, data)
/usr/lib/python3.12/urllib/request.py:532: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.12/urllib/request.py:492: in _call_chain
    result = func(*args)
/usr/lib/python3.12/urllib/request.py:1392: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.12/urllib/request.py:1347: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.12/urllib/request.py:1344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/http/client.py:1336: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.12/http/client.py:1382: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1331: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.12/http/client.py:1091: in _send_output self.send(msg) /usr/lib/python3.12/http/client.py:1035: in send self.connect() /usr/lib/python3.12/http/client.py:1470: in connect super().connect() /usr/lib/python3.12/http/client.py:1001: in connect self.sock = self._create_connection( /usr/lib/python3.12/socket.py:865: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. 
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinCSV>

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
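The second half of each traceback is urllib.request's do_open() catching the
OSError and re-raising it wrapped in URLError (request.py:1347 above). The
same wrapping can be observed directly with urlopen(); the /sparql path below
is a placeholder, not taken from the test suite:

import urllib.error
import urllib.request

try:
    # The connect is refused before any TLS or HTTP exchange happens.
    urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
except urllib.error.URLError as err:
    # err.reason carries the original ConnectionRefusedError.
    print(type(err.reason).__name__, err.reason)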
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
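For context, the failing __generic() helper drives SPARQLWrapper's public API
(the Wrapper.py:960/926 frames above). A minimal sketch of that call path;
the endpoint URL and query are placeholders, not the suite's actual fixtures:

from SPARQLWrapper import CSV, GET, SPARQLWrapper

sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")  # placeholder endpoint
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")    # placeholder query
sparql.setReturnFormat(CSV)   # selects the Accept header seen in this log
sparql.setMethod(GET)
result = sparql.query()       # raises URLError when the endpoint is unreachable
print(result.response.read())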
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:308:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
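The "timeout = <object object at 0x...>" in the create_connection() context
lines is the module-level sentinel _GLOBAL_DEFAULT_TIMEOUT: when the caller
passes no explicit timeout, the new socket simply inherits
socket.getdefaulttimeout(). Illustration:

import socket

print(socket.getdefaulttimeout())   # None by default: blocking sockets
socket.setdefaulttimeout(10.0)      # create_connection() now inherits 10s
print(socket.getdefaulttimeout())   # 10.0
socket.setdefaulttimeout(None)      # restore the default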
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected>

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
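The create_connection() docstring quoted in these tracebacks mentions
all_errors: with all_errors=True (Python 3.11+), the function raises an
ExceptionGroup collecting the failure from every address it tried, instead of
only the last one. A sketch, again assuming nothing listens on local port 9:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
except* ConnectionRefusedError as group:
    for exc in group.exceptions:   # one refusal per address attempted
        print(exc)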
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
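do_open() (listed in full near the start of this section) forces
"Connection: close" because urllib's addinfourl response object cannot manage
a persistent connection, and it normalizes header-name casing - which is why
every headers dict in this log shows 'Accept', 'Host', and so on. The same
normalization is visible on a bare Request; the URL is a placeholder:

import urllib.request

req = urllib.request.Request("https://live.dbpedia.org/sparql")
req.add_header("accept", "text/tab-separated-values")  # lower-case on purpose
print(req.headers)   # {'Accept': 'text/tab-separated-values'}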
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinTSV>

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.12/socket.py:850: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.12/urllib/request.py:215: in urlopen
    return opener.open(url, data, timeout)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.12/urllib/request.py:1347: URLError
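All of these failures are environmental rather than regressions in
sparql-wrapper-python: the suite unconditionally queries a remote SPARQL
endpoint, which cannot work in a build environment without network access.
One common mitigation - hypothetical here, not something the package
currently does - is to probe the endpoint once and skip instead of failing:

import socket

import pytest

def endpoint_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

@pytest.mark.skipif(not endpoint_reachable("live.dbpedia.org", 443),
                    reason="SPARQL endpoint unreachable (offline build)")
def test_select_by_get_in_json():
    ...   # would exercise SPARQLWrapper against the live endpoint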
If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.12/socket.py:850: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) test/test_virtuoso__v8_03_3313__dbpedia.py:450: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.12/urllib/request.py:215: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.12/urllib/request.py:515: in open response = self._open(req, data) /usr/lib/python3.12/urllib/request.py:532: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.12/urllib/request.py:492: in _call_chain result = func(*args) /usr/lib/python3.12/urllib/request.py:1392: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:459:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:214:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:222:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.12/urllib/request.py:1347: URLError
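The four Virtuoso/DBpedia failures above all funnel through the __generic helper at test/test_virtuoso__v8_03_3313__dbpedia.py:190, and the *inUnknow variants additionally request the unsupported return format "foo", which is what produces the Ignore format 'foo' warnings collected later in the summary. A sketch of the equivalent direct SPARQLWrapper call; the endpoint URL is illustrative, and in this environment any endpoint is unreachable:

    from SPARQLWrapper import SPARQLWrapper, XML
    from urllib.error import URLError

    # Illustrative endpoint; in this build it effectively resolves to 127.0.0.1:9.
    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")

    sparql.setReturnFormat("foo")  # warns: Ignore format 'foo'; current instance supports: ...
    sparql.setReturnFormat(XML)    # what the tests fall back to (Accept: application/sparql-results+xml)

    try:
        result = sparql.query()    # urlopen() -> ConnectionRefusedError -> URLError
    except URLError as err:
        print("endpoint unreachable:", err.reason)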
_________________________ QueryResult_Test.testConvert _________________________

self =

    def testConvert(self):
        class FakeResponse(object):
            def __init__(self, content_type):
                self.content_type = content_type

            def info(self):
                return {"content-type": self.content_type}

            def read(self, len):
                return ""

        def _mime_vs_type(mime, requested_type):
            """
            :param mime: mimetype/Content-Type of the response
            :param requested_type: requested mimetype (alias)
            :return: number of warnings produced by combo
            """
            with warnings.catch_warnings(record=True) as w:
                qr = QueryResult((FakeResponse(mime), requested_type))
                try:
                    qr.convert()
                except:
                    pass
                # if len(w) > 0: print(w[0].message) # FOR DEBUG
                # if len(w) > 1: print(w[1].message) # FOR DEBUG
                return len(w)

        # In the cases of "application/ld+json" and "application/rdf+xml", the
        # RDFLib raised a warning because the manually created QueryResult has no real
        # response value (implemented a fake read).
        # "WARNING:rdflib.term: does not look like a valid URI, trying to serialize this will break."
        self.assertEqual(0, _mime_vs_type("application/sparql-results+xml", XML))
        self.assertEqual(0, _mime_vs_type("application/sparql-results+json", JSON))
        self.assertEqual(0, _mime_vs_type("text/n3", N3))
        self.assertEqual(0, _mime_vs_type("text/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/json", JSON))
>       self.assertEqual(0, _mime_vs_type("application/ld+json", JSONLD))
E       AssertionError: 0 != 1

test/test_wrapper.py:876: AssertionError
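testConvert is the one failure above that is not network-related: it counts the warnings emitted while converting a faked response, and the ("application/ld+json", JSONLD) combination now emits one rdflib warning where the test expects zero. The counting pattern itself is plain stdlib; a self-contained sketch of it, where count_warnings and noisy are illustrative names and simplefilter is added so the count is deterministic:

    import warnings

    def count_warnings(fn, *args, **kwargs):
        """Run fn and return how many warnings it emitted (the pattern
        used by _mime_vs_type in test/test_wrapper.py)."""
        with warnings.catch_warnings(record=True) as caught:
            warnings.simplefilter("always")  # make sure nothing is suppressed
            try:
                fn(*args, **kwargs)
            except Exception:
                pass                         # the test ignores conversion errors too
            return len(caught)

    def noisy():
        warnings.warn("does not look like a valid URI", UserWarning)

    assert count_warnings(noisy) == 1        # one recorded warning, as in the failing case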
=============================== warnings summary ===============================
test/test_agrovoc-allegrograph_on_hold.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_agrovoc-allegrograph_on_hold.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_allegrograph__v4_14_1__mmi.py:163
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_allegrograph__v4_14_1__mmi.py:163: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_blazegraph__wikidata.py:172
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_blazegraph__wikidata.py:172: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_fuseki2__v3_6_0__agrovoc.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_fuseki2__v3_6_0__agrovoc.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_fuseki2__v3_8_0__stw.py:165
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_fuseki2__v3_8_0__stw.py:165: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_graphdbEnterprise__v8_9_0__rs.py:176
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_graphdbEnterprise__v8_9_0__rs.py:176: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_lov-fuseki_on_hold.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_lov-fuseki_on_hold.py:167: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_rdf4j__geosciml.py:173
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_rdf4j__geosciml.py:173: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_stardog__lindas.py:177
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_stardog__lindas.py:177: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_store__v1_1_4.py:162
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_store__v1_1_4.py:162: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_virtuoso__v7_20_3230__dbpedia.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_virtuoso__v7_20_3230__dbpedia.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_virtuoso__v8_03_3313__dbpedia.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test/test_virtuoso__v8_03_3313__dbpedia.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_lov-fuseki_on_hold.py: 8 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 8 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 8 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'foo'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'bar'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 2 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 1 warning
test/test_allegrograph__v4_14_1__mmi.py: 1 warning
test/test_blazegraph__wikidata.py: 1 warning
test/test_fuseki2__v3_6_0__agrovoc.py: 1 warning
test/test_fuseki2__v3_8_0__stw.py: 1 warning
test/test_graphdbEnterprise__v8_9_0__rs.py: 1 warning
test/test_lov-fuseki_on_hold.py: 1 warning
test/test_rdf4j__geosciml.py: 1 warning
test/test_stardog__lindas.py: 1 warning
test/test_store__v1_1_4.py: 1 warning
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:663: UserWarning: keepalive support not available, so the execution of this method has no effect
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 4 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 2 warnings
test/test_wrapper.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_cli.py: 1 warning
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf+xml' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'turtle' in a 'SELECT' SPARQL query form
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
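The SyntaxWarning entries at the top of the warnings summary above all come from the same pattern: a '\:' escape inside a regular (non-raw) string literal in the queryWithCommaInCurie_2 queries, which Python 3.12 reports at compile time. A short sketch showing the warning and the raw-string fix; the one-line source snippets are stand-ins for the real queries:

    import warnings

    src_bad = "q = '\\:'"    # source text containing the escape \: in a normal string
    src_good = "q = r'\\:'"  # the same text as a raw string

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        compile(src_bad, "<sparql-query>", "exec")
    print(len(caught))       # 1 -- SyntaxWarning: invalid escape sequence '\:'

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        compile(src_good, "<sparql-query>", "exec")
    print(len(caught))       # 0 -- raw string, no warning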
=========================== short test summary info ============================
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSONLD FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED
test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_cli.py::SPARQLWrapperCLIParser_Test::testInvalidFormat - Ass... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF - urllib.error.U... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryTo4store - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAgrovoc_AllegroGraph FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAllegroGraph - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToBrazeGraph - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_6 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_8 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToGraphDBEnterprise FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToLovFuseki - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToRDF4J - urllib.err... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToStardog - urllib.e... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV7 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV8 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithEndpoint - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFile - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileCSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileN3 - urllib.... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle - url... 
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtleQuiet FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileXML - urllib... FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED 
test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testKeepAlive - u... FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED 
test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testKeepAlive - urll... 
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryBadFormed_1 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED 
test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testKeepAlive - urll... 
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV - ur... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON - u... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testKeepAlive - urllib... 
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryBadFormed - u... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED 
test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_wrapper.py::QueryResult_Test::testConvert - AssertionError: ...
= 858 failed, 38 passed, 549 skipped, 80 xfailed, 381 warnings in 728.34s (0:12:08) =
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build; python3.12 -m pytest test
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.12 returned exit code 13
make[1]: Leaving directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
create-stamp debian/debhelper-build-stamp
dh_testroot -O--buildsystem=pybuild
dh_prep -O--buildsystem=pybuild
dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild
I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test
I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper
running install
/usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!
********************************************************************************
Please avoid running ``setup.py`` directly.
Instead, use pypa/build, pypa/installer or other standards-based tools.
See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
********************************************************************************
!!
create-stamp debian/debhelper-build-stamp
dh_testroot -O--buildsystem=pybuild
dh_prep -O--buildsystem=pybuild
dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild
I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/test
I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper
running install
/usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!

        ********************************************************************************
        Please avoid running ``setup.py`` directly.
        Instead, use pypa/build, pypa/installer or other
        standards-based tools.

        See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
        ********************************************************************************

!!
  self.initialize_options()
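Note: the SetuptoolsDeprecationWarning above is triggered because pybuild drives setup.py install directly. For illustration only, the standards-based flow the warning recommends would look roughly like this (hypothetical commands, not what this build runs):

    # build a wheel with pypa/build, then stage it with pypa/installer
    python3 -m build --wheel
    python3 -m installer --destdir=debian/python3-sparqlwrapper dist/*.whl

In Debian packaging, a pyproject.toml-based build (for example via the pybuild-plugin-pyproject package) follows this flow and avoids the warning.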
running build
running build_py
running install_lib
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/README.md -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/.gitignore -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache/v
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/v/cache/nodeids -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/v/cache/stepwise -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/v/cache/lastfailed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/.pytest_cache/CACHEDIR.TAG -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/Wrapper.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/SPARQLExceptions.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/main.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/SmartWrapper.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/KeyCaseInsensitiveDict.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/__init__.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/__pycache__/sparql_dataframe.cpython-312.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.12_sparqlwrapper/build/SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper
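Note: the install_lib step above also copied .pytest_cache, state left behind by the failed test run inside the build tree, into the dist-packages directory, so it would be shipped in the binary package. A hypothetical debian/rules override to strip it afterwards (sketch only, not part of this package's packaging):

    override_dh_auto_install:
    	dh_auto_install
    	# drop stray pytest cache state from the staged package tree
    	rm -rf debian/python3-sparqlwrapper/usr/lib/python3*/dist-packages/.pytest_cache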
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/SmartWrapper.py to SmartWrapper.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/__init__.py to __init__.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/main.py to main.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/SPARQLExceptions.py to SPARQLExceptions.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/Wrapper.py to Wrapper.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/KeyCaseInsensitiveDict.py to KeyCaseInsensitiveDict.cpython-312.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper/sparql_dataframe.py to sparql_dataframe.cpython-312.pyc
running install_egg_info
running egg_info
creating SPARQLWrapper.egg-info
writing SPARQLWrapper.egg-info/PKG-INFO
writing dependency_links to SPARQLWrapper.egg-info/dependency_links.txt
writing entry points to SPARQLWrapper.egg-info/entry_points.txt
writing requirements to SPARQLWrapper.egg-info/requires.txt
writing top-level names to SPARQLWrapper.egg-info/top_level.txt
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files found matching 'Makefile'
warning: no directories found matching 'docs/build/html'
adding license file 'LICENSE.txt'
adding license file 'AUTHORS.md'
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
Copying SPARQLWrapper.egg-info to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.12/dist-packages/SPARQLWrapper-2.0.0.egg-info
Skipping SOURCES.txt
running install_scripts
Installing rqw script to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/bin
dh_installdocs -O--buildsystem=pybuild
dh_installchangelogs -O--buildsystem=pybuild
dh_installexamples -O--buildsystem=pybuild
dh_python3 -O--buildsystem=pybuild
dh_installsystemduser -O--buildsystem=pybuild
dh_perl -O--buildsystem=pybuild
dh_link -O--buildsystem=pybuild
dh_strip_nondeterminism -O--buildsystem=pybuild
dh_compress -O--buildsystem=pybuild
dh_fixperms -O--buildsystem=pybuild
dh_missing -O--buildsystem=pybuild
dh_installdeb -O--buildsystem=pybuild
dh_gencontrol -O--buildsystem=pybuild
dh_md5sums -O--buildsystem=pybuild
dh_builddeb -O--buildsystem=pybuild
dpkg-deb: building package 'python3-sparqlwrapper' in '../python3-sparqlwrapper_2.0.0-2_all.deb'.
dpkg-genbuildinfo --build=binary -O../sparql-wrapper-python_2.0.0-2_amd64.buildinfo
dpkg-genchanges --build=binary -O../sparql-wrapper-python_2.0.0-2_amd64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
dpkg-source --after-build .
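Note: dpkg-genbuildinfo above records the toolchain and environment of this build in the .buildinfo file. The /build/reproducible-path prefix indicates a reproducibility test, in which the package is built twice under deliberately varied conditions and the two binary packages are compared bit by bit, for instance with diffoscope (illustrative invocation; the b1/ and b2/ result directories are an assumed layout):

    diffoscope b1/python3-sparqlwrapper_2.0.0-2_all.deb b2/python3-sparqlwrapper_2.0.0-2_all.deb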
dpkg-buildpackage: info: binary-only upload (no source included)
dpkg-genchanges: info: not including original source code in upload
I: copying local configuration
I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/B01_cleanup starting
I: user script /srv/workspace/pbuilder/2595058/tmp/hooks/B01_cleanup finished
I: unmounting dev/ptmx filesystem
I: unmounting dev/pts filesystem
I: unmounting dev/shm filesystem
I: unmounting proc filesystem
I: unmounting sys filesystem
I: cleaning the build env
I: removing directory /srv/workspace/pbuilder/2595058 and its subdirectories
I: Current time: Wed Jan 21 13:51:15 +14 2026
I: pbuilder-time-stamp: 1768953075